Equifax, one of the largest consumer credit reporting organizations, recently suffered a major cybersecurity breach that made headlines and sent American consumers into a panic.
Here’s a brief summary:
- The sensitive information — including Social Security and driver's license numbers — of around 143 million American consumers was potentially compromised
- There is a 50% chance that the sensitive information of any American with a credit report has been compromised
- The credit card numbers of around 209,000 consumers were also potentially stolen, along with dispute documents containing personal information for around 182,000 consumers
This is not the first time Equifax was involved in a massive breach. In 2016 and early 2017, hackers got their hands on W-2 tax data from Equifax and its subsidiaries’ websites.
Understandably, Equifax executives are now under fire for not strengthening their security measures following these earlier attacks.
But what could they have done to minimize the impacts of their current security breach?
Tokenization: Minimizing the Impact of Security Breaches
The Equifax security breach can indeed be blamed on weak security measures and controls. However, as cyber crimes grow more sophisticated, it is increasingly difficult to protect an organization from every emerging threat. Equifax could nonetheless have significantly reduced the impact of the breach if it had employed tokenization, which is generally considered more resistant to attack than encryption. Encrypted data is mathematically derived from the original, and so it can be mathematically reversed given enough computing power and time.
Tokenization, by contrast, is the process of replacing sensitive information, such as credit card numbers, with surrogate sets of characters, called tokens, that have no extrinsic or exploitable meaning or value. Because the organization stores and processes tokens instead of the actual sensitive data, a security breach would hand criminals nothing but meaningless characters. Had it used tokenization, Equifax could have kept consumers' sensitive information out of cyber criminals' hands.
How Tokenization Works
Tokenization involves a three-step process: capturing the information, replacing the information with tokens, and storing the information.
Capturing the information. In a tokenized system, an organization such as Equifax captures sensitive information from its customers. This information may include credit card numbers and other personally identifiable information (PII).
Replacing the information. The organization utilizes a tokenization system that is often provided by a third-party solution provider. The tokenization system replaces the information captured with tokens. Each piece of information is represented in the organization’s system by a specific token.
Storing the information. The sensitive information is then stored in a secure vault, often managed by the solution provider, thus adding an extra layer of security. The secure vault matches the original, sensitive information with each token that replaced it, making it possible for the organization to view and utilize the original data and maintain control over it.
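The three steps above can be sketched in code. This is a minimal, illustrative in-memory model, not any vendor's actual product: the `TokenVault` class and its method names are hypothetical, and a production vault would be an encrypted, access-controlled service operated by the tokenization provider.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random surrogate tokens to the original
    sensitive values. Illustrative only; a real vault is an encrypted,
    access-controlled service run by the tokenization provider."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so each value maps to one surrogate.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it cannot be mathematically reversed;
        # the only link back to the original is the vault's lookup table.
        token = "tok_" + secrets.token_hex(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault (not the organization's own database) can
        # recover the original value.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
# The organization stores only `token`; a breach of its database
# exposes nothing but meaningless characters.
```

The key property is that, unlike encryption, there is no mathematical relationship between the token and the original value; without access to the vault, the token is just noise.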
If Equifax had used tokenization, the only information stored in its internal databases would have been tokens. Its customers' sensitive information would have been stored in the tokenization provider's secure, encrypted vault, accessible only with a specialized, highly protected key. So even if Equifax's databases had been compromised by a breach, its customers' information would have remained safe with the tokenization solution provider.
Furthermore, the responsibility and burden of safeguarding the sensitive information — as well as improving the security measures needed to protect them — would have been on the tokenization solution provider, not on Equifax.
A Secure, Cloud-based Tokenization Solution
Liaison's tokenization solution stores sensitive data in its encrypted cloud, minimizing the impact of security breaches and other cyber crimes. Delivered on the award-winning ALLOY™ Platform, Liaison's cloud-based tokenization technology fully protects credit card, payment, healthcare, and personally identifiable information. It also balances the competing objectives of access and security by substituting sensitive data throughout enterprise systems with format-preserving tokens. This lets enterprises avoid back-end system modifications and allows data analysis operations to continue as usual.
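To illustrate what "format-preserving" means, the hypothetical sketch below replaces a card number's digits with random ones while keeping its length, separators, and last four digits, so downstream systems that validate the format or display "ending in 1111" keep working. This is not Liaison's actual algorithm; production format-preserving tokenization is typically deterministic and keyed (e.g., NIST-approved FF1 format-preserving encryption).

```python
import secrets

def format_preserving_token(card_number: str) -> str:
    """Return a surrogate with the same layout as the input:
    same length, same separator positions, same last four digits.
    Illustrative only; the function name and approach are assumptions."""
    digits = [c for c in card_number if c.isdigit()]
    keep = set(range(len(digits) - 4, len(digits)))  # keep last four digits
    out = []
    i = 0  # index among digits only
    for c in card_number:
        if c.isdigit():
            out.append(c if i in keep else secrets.choice("0123456789"))
            i += 1
        else:
            out.append(c)  # keep separators such as '-' in place
    return "".join(out)

token = format_preserving_token("4111-1111-1111-1111")
# `token` has the same shape as the input, e.g. '9273-0462-8851-1111'
```

Because the token looks like a real card number, existing validation, storage schemas, and analytics that key off format continue to work unchanged.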
With Liaison's tokenization technology, enterprises can minimize points of risk by removing sensitive information from their systems. Enterprises can also reduce the costs of complying with regulatory standards such as the Payment Card Industry Data Security Standard (PCI DSS) and avoid building complex on-premises tokenization solutions. Contact our data experts to learn more about our tokenization solution.