Using Tokenization for Superior Data Security


Analytical Value

Tokenization preserves the data type and length of the original data; a small number of characters may be left in their original, 'exposed' form, and a given value always maps to the same token, so pattern matching and analysis still work. This protects the data well enough that it holds no value for a hacker or malicious user, while allowing business insight and processing to continue as usual. Requiring minimal modification to existing systems, vaultless tokenization provides continuous data protection from the moment of capture to storage in the data center, and everywhere in between.
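The short Python sketch below shows what these properties look like in practice. It is a hypothetical illustration, not Protegrity's vaultless algorithm: a production tokenizer would use reversible format-preserving encryption or token tables, whereas this keyed-hash version simply demonstrates a token that keeps the original length and data type, leaves the last four characters exposed, and always maps the same input to the same token.

```python
# Illustrative sketch of format-preserving, consistent tokenization.
# Hypothetical example only -- not a specific vendor's vaultless algorithm.
import hmac
import hashlib

SECRET_KEY = b"demo-key-change-me"  # assumed key; real deployments keep key material in an HSM/KMS

def tokenize(value: str, exposed_suffix: int = 4) -> str:
    """Replace all but the last `exposed_suffix` characters with consistent token digits."""
    head, tail = value[:-exposed_suffix], value[-exposed_suffix:]
    digest = hmac.new(SECRET_KEY, head.encode(), hashlib.sha256).hexdigest()
    # Map hex digest characters onto digits so the token keeps the original length and type.
    token_head = "".join(str(int(ch, 16) % 10) for ch in digest[:len(head)])
    return token_head + tail

if __name__ == "__main__":
    pan = "4111111111111111"
    print(tokenize(pan))                    # 16 digits, last four '1111' left exposed
    print(tokenize(pan) == tokenize(pan))   # True: same input -> same token, so analytics still work
```

Because the mapping is deterministic, downstream systems can still join, group and deduplicate on the token without ever seeing the real value.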

With the launch of Apple Pay for secure online and mobile payments, tokenization has become a hot topic in data security. Born from the need to secure PCI data, tokenization substitutes real data with a surrogate value, or 'token', that has no value to a thief.

Tokenization can be used to de-identify any structured sensitive information as defined by PCI DSS, HIPAA, NIST and other standards: credit card numbers, names, Social Security numbers, addresses, and any other payment card (PCI), personally identifiable information (PII), or protected health information (PHI) data. Tokens have the same appearance as the data they replace, so they can be used for processing and analytics in existing systems with minimal modification (see the sketch below). Recent statistics from CSIS put the cost of breaches last year at around $400 billion, with over 100 million people affected globally.
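As a follow-on to the earlier sketch, the example below shows how the same deterministic, keyed substitution could be applied per character class so that tokens for SSNs, names and other structured fields keep the shape of the data they replace: digits stay digits, letters keep their case, and separators stay in place. The helper names are hypothetical and do not reflect any specific vendor's API.

```python
# Illustrative, character-class-preserving tokenization of structured fields.
import hmac
import hashlib
import string

SECRET_KEY = b"demo-key-change-me"  # assumed key material for the illustration

def _replacement(ch: str, position: int, field: str) -> str:
    """Deterministically choose a replacement character of the same class as `ch`."""
    digest = hmac.new(SECRET_KEY, f"{field}:{position}:{ch}".encode(), hashlib.sha256).digest()
    if ch.isdigit():
        return string.digits[digest[0] % 10]
    if ch.isalpha():
        letters = string.ascii_uppercase if ch.isupper() else string.ascii_lowercase
        return letters[digest[0] % 26]
    return ch  # keep separators such as '-' or ' ' so the field layout is unchanged

def tokenize_field(value: str, field: str) -> str:
    """Tokenize a structured field while preserving its character classes and length."""
    return "".join(_replacement(ch, i, field) for i, ch in enumerate(value))

if __name__ == "__main__":
    print(tokenize_field("123-45-6789", "ssn"))    # still looks like ddd-dd-dddd
    print(tokenize_field("Jane Smith", "name"))    # still letters with the original capitalization
```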

As data creation grows exponentially, as the cost, risk and sophistication of breaches increase, and as laws governing data security are debated and reformed around the world, industries of all kinds are turning to tokenization to secure their data. But not all tokenization is created equal. Modern vaultless tokenization delivers the most secure, transparent and flexible solution, with the greatest ROI, for organizations looking to protect and realize the value in their sensitive and customer data. In this slideshow, Protegrity explains five characteristics of tokenization that every organization can benefit from.

 
