The Answer: Tokenization
Tokenization is one of the strategies organizations consider when looking to protect sensitive data at rest, in transit, or in the cloud. Tokenization is the process of replacing a sensitive data field with a surrogate value called a token. De-tokenization is the reverse process: replacing a token with its associated clear-text value.
You may be wondering how tokenization differs from encryption. With tokenization, the original data is removed entirely and replaced by the token, whereas an encrypted value still bears a mathematical relationship to its clear-text form. Tokens cannot be returned to their corresponding clear-text values without access to a secured "look-up" table that matches them to their original values. Tokenization also tends to be more flexible than traditional encryption: unlike encrypted values, tokens can be generated with no relationship to the length or format of the original value.
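To make the look-up-table idea concrete, here is a minimal Python sketch of vault-based tokenization. The TokenVault class and its method names are illustrative, and an in-memory dictionary stands in for the secured look-up table that a real system would keep in a hardened, access-controlled data store:

```python
import secrets

# A minimal sketch of vault-based tokenization. A plain dict stands in for
# the secured "look-up" table; a production system would store this vault
# in a hardened, access-controlled data store.

class TokenVault:
    def __init__(self) -> None:
        self._token_to_clear = {}   # token -> clear text (the look-up table)
        self._clear_to_token = {}   # clear text -> token, so repeat values reuse a token

    def tokenize(self, clear_text: str) -> str:
        """Replace a sensitive value with a random surrogate token."""
        if clear_text in self._clear_to_token:
            return self._clear_to_token[clear_text]
        # The token is drawn at random, so it bears no mathematical
        # relationship to the original value (unlike ciphertext).
        token = secrets.token_hex(8)
        self._token_to_clear[token] = clear_text
        self._clear_to_token[clear_text] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the clear text; impossible without access to the vault."""
        return self._token_to_clear[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")  # a sample card number
print(token)                                   # e.g. "9f3a1c0d2b7e4a55"
print(vault.detokenize(token))                 # "4111-1111-1111-1111"
```

Because the token is generated at random rather than computed from the input, someone who obtains only the token learns nothing about the original value's content, length, or format; recovery is possible only through the vault itself.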