The Importance of Tokenization
Tokenization protects sensitive data in a database application (typically credit card numbers and Social Security numbers) by replacing each value with a surrogate token, while still supporting the workflows that were in place before tokenization was implemented.
Limitations of Tokenization
Tokenization is a two-way process: given authorized access, the original data can be recovered from a token. Therefore, access to the database that maps tokens to the original values must be strictly secured.
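The mapping described above can be sketched as a minimal in-memory token vault. This is an illustrative sketch only, not a production design: the class name and methods are hypothetical, real vaults persist the mapping in a separately secured database, and tokens here are random digit strings matching the length of the input (as for a card number).

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault. A real vault would live in a
    separate, access-controlled datastore, not in application memory."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so the same input always maps to one token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Generate a random token with the same length and digit format
        # as the original value (e.g. a 16-digit card number).
        while True:
            token = "".join(secrets.choice("0123456789") for _ in value)
            if token not in self._token_to_value and token != value:
                break
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # This lookup is why access to the vault must be tightly controlled:
        # anyone who can call it recovers the original data.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
original = vault.detokenize(token)
```

Because the token is random rather than derived from the value, there is no key to steal: compromising the application database alone reveals only tokens.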
Note that tokenization does not work with variable-length fields; tokens are generated to match a known, fixed length and format (such as a 16-digit card number).
Tokenization usually must be implemented in an application server or a separate API server, which takes considerably more effort than simply turning on Transparent Database Encryption.
Vidder’s Tokenization Recommendations
Vidder recommends using a highly secure tokenizer that generates well-formed random tokens for credit card numbers, Social Security numbers, and all other sensitive, fixed-length fields stored in a database. Tokenization provides better security than Transparent Database Encryption because database administrators never have access to the real data.
Vidder recommends using tokenization instead of Format Preserving Encryption because tokenization is a well-established way to meet PCI DSS compliance requirements.