To protect consumers, organizations like the Payment Card Industry have instituted security mandates such as the Data Security Standard (PCI DSS), and governments have passed privacy laws. While these mandates and laws require companies to take certain steps to protect consumer and patient information such as credit card numbers and various types of Personally Identifiable Information (PII), chief information security officers (CISOs) are also faced with protecting company-confidential information ranging from employee information to intellectual property. Almost always, this means finding the best way to secure many types of data stored on a variety of hardware, from mobile devices to desktops, servers and mainframes, and in many different applications and databases. Further, as some companies have learned the hard way, being compliant doesn't equate to being secure. Breaches have occurred in companies that had taken the necessary steps to pass PCI DSS compliance audits.
Companies typically rely on strong local encryption to protect data. While effective, it does present some challenges. For example, encrypted data takes more space than unencrypted data. Trying to fit the larger cipher text of a 16-digit credit card number back into the 16-digit field poses a 'square peg into a round hole' kind of storage problem with consequences that ripple through the business applications that use the data. Storing encrypted values in place of the original data often requires companies to contract for costly programming modifications to existing applications and databases. What's more, for businesses that must comply with PCI DSS, any system that contains encrypted card data is 'in scope' for PCI DSS compliance and audits. Every in-scope system adds to the cost and complexity of compliance.
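The size penalty is easy to quantify. The sketch below walks through the storage arithmetic for a 16-digit card number under a typical scheme, assuming AES in CBC mode with PKCS#7 padding and Base64 encoding for storage in a text column; the ciphertext bytes are random stand-ins rather than real AES output, since only the lengths matter here.

```python
import base64
import os

pan = "4111111111111111"   # 16-digit card number: 16 bytes as text
BLOCK = 16                 # AES block size in bytes

# PKCS#7 padding rounds the plaintext up to the next full block,
# so exactly 16 bytes of plaintext become 32 bytes of ciphertext.
padded_len = (len(pan) // BLOCK + 1) * BLOCK

# A random IV must be stored alongside the ciphertext.
iv = os.urandom(BLOCK)
simulated_ciphertext = iv + os.urandom(padded_len)  # stand-in bytes, not real AES

# Binary ciphertext kept in a text column is typically Base64-encoded,
# which inflates it by another third.
stored = base64.b64encode(simulated_ciphertext).decode()

print(len(pan), len(simulated_ciphertext), len(stored))  # 16 48 64
```

A 16-character field would thus need to hold 64 characters, which is exactly the 'square peg into a round hole' problem described above.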
To reduce the points of risk as well as the scope of PCI DSS audits, and to provide another level of security, a new data security model, tokenization, is gaining traction with CISOs who need to protect all manner of confidential information in an IT environment.
What is Tokenization?
With traditional encryption, when a database or application needs to store sensitive data such as credit card or national insurance numbers, those values are encrypted and then the cipher text is returned to the original location. With tokenization, a token, or surrogate value, is returned and stored in place of the original data. The token is a reference to the actual cipher text, which is usually stored in a central data vault. This token can then be safely used by any file, application, database or backup medium throughout the organization, thus minimizing the risk of exposing the actual sensitive data. Because you can control the format of the token, and because the token is consistent for all instances of a particular sensitive data value, your business and analytical applications continue seamless operation.
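The model above can be sketched in a few lines of code. This is an illustration only, assuming an in-memory vault; the class and method names (`TokenVault`, `tokenize`, `detokenize`) are invented for the example, and a real vault would encrypt the stored value and persist it in a hardened database.

```python
import secrets

class TokenVault:
    """Toy central vault mapping tokens to protected values (illustration only)."""

    def __init__(self):
        self._vault = {}      # token -> protected value
        self._by_value = {}   # value -> token, so repeat values reuse one token

    def tokenize(self, value: str) -> str:
        # Return the same token for every instance of a given value, so
        # joins and analytics on the tokenized column keep working.
        if value in self._by_value:
            return self._by_value[value]
        token = secrets.token_hex(8)   # random: not mathematically derived from value
        self._vault[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
t1 = vault.tokenize("4111111111111111")
t2 = vault.tokenize("4111111111111111")
assert t1 == t2                                   # consistent surrogate
assert vault.detokenize(t1) == "4111111111111111" # real value lives only in the vault
```

The key property is in the `tokenize` method: the token is drawn at random rather than computed from the card number, so possessing the token alone reveals nothing about the original value.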
Tokenization is an alternative data protection architecture that is ideal for some organizations' requirements. It reduces the number of points where sensitive data is stored within an enterprise, making it easier to manage and more secure. It's much like storing the Crown Jewels in the Tower of London. Both are single repositories of important items, well guarded and easily managed.
The newest form of tokenization, called Format Preserving Tokenization, creates a token, or surrogate value, that represents the original data and fits precisely in its place, avoiding the larger amount of storage required by encrypted data. Additionally, to maintain some of the business context of the original value, certain portions of the data can be retained within the token that is generated. The encrypted data the token represents is then locked in the central data vault.
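A minimal sketch of that idea follows, assuming a card number where the last four digits are retained for business context (receipts, customer service lookups); the function name is invented for the example, and commercial products apply further rules not shown here.

```python
import secrets

def format_preserving_token(pan: str, keep_last: int = 4) -> str:
    """Return a surrogate with the same length and character class as the
    original card number, preserving its trailing digits (illustration only)."""
    random_part = "".join(secrets.choice("0123456789")
                          for _ in range(len(pan) - keep_last))
    return random_part + pan[-keep_last:]

token = format_preserving_token("4111111111111111")
print(token)   # 16 digits, same shape as a card number, ending in 1111
```

Because the token is all digits and exactly 16 characters, it drops into the existing database field and application logic unchanged; some real products additionally force the token to fail a Luhn check so it can never be mistaken for a live card number.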
Because tokens are not mathematically derived from the original data, they are arguably even safer to expose than encrypted values. A token can be passed around the network between applications, databases and business processes safely, all the while leaving the encrypted data it represents securely stored in the central repository. Authorized applications that need access to encrypted data can retrieve it only with proper credentials and a token issued from a token server, providing an extra layer of protection for sensitive information and preserving storage space at data collection points.
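The credential check at the token server is the crux of that extra layer. The sketch below shows the gating logic under simplified assumptions: the application names, API keys and function signatures are invented for the example, and a real token server would authenticate callers far more robustly (mutual TLS, audited key management).

```python
import secrets

# Illustrative credential store: which applications may detokenize.
AUTHORIZED_APPS = {"settlement-service": "example-api-key"}

_vault = {}   # token -> protected value (in-memory stand-in for the data vault)

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)
    _vault[token] = value
    return token

def detokenize(token: str, app_id: str, api_key: str) -> str:
    # Only callers presenting valid credentials may exchange a token for
    # the protected value; every other system handles the token alone.
    if AUTHORIZED_APPS.get(app_id) != api_key:
        raise PermissionError("application not authorized for detokenization")
    return _vault[token]

t = tokenize("4111111111111111")
print(detokenize(t, "settlement-service", "example-api-key"))  # authorized caller
```

An unauthorized caller holding the same token gets a `PermissionError`, which is the point: the token circulates freely, but the sensitive value is released only at this one checkpoint.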
Replacing encrypted data with tokens also provides a way for organizations to reduce the number of employees who can access sensitive data, dramatically reducing the risk of internal data theft. Under the tokenization model, only authorized employees have access to encrypted data such as customer information, and even fewer employees have access to the clear text, decrypted data.