A Few PCI DSS 2.0 Things Really Worth Knowing


As today marks the official first day of version 2.0 of the Payment Card Industry Data Security Standard (PCI DSS), here are a few things well worth considering.

The first is that PCI DSS compliance is, in general, such a hassle to deal with that for most organizations, outsourcing management of the whole process to a cloud computing provider is the better part of valor. There are plenty of options when it comes to these services, most of which rely on a tokenization process that abstracts away the management of credit card data.
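To make the idea concrete, here is a minimal sketch of how tokenization abstracts card data. This is an illustration only, not any vendor's actual implementation: the function names and the in-memory "vault" are hypothetical, and a real provider would keep the token-to-PAN mapping in hardened, PCI-scoped storage on its own side.

```python
import secrets

# Hypothetical in-memory "token vault" -- in practice this mapping
# lives only inside the tokenization provider's secured systems.
_vault = {}

def tokenize(pan: str) -> str:
    """Replace a card number (PAN) with a random surrogate token.

    The token keeps the last four digits so receipts and lookups still
    work, but it is random: knowing the token reveals nothing about
    the real card number.
    """
    token = "tok_" + secrets.token_hex(8) + "_" + pan[-4:]
    _vault[token] = pan  # only the vault holder can map token -> PAN
    return token

def detokenize(token: str) -> str:
    """Recover the original PAN; only the vault holder can do this."""
    return _vault[token]

token = tokenize("4111111111111111")
# The merchant stores and passes around only the token, which keeps
# real card data -- and much of PCI DSS scope -- out of its systems.
assert token != "4111111111111111"
assert detokenize(token) == "4111111111111111"
```

The point of the design is the last two lines: the merchant's databases hold only tokens, so a breach of those systems exposes nothing a thief can charge against.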

The latest news on this front is an alliance between VeriFone and the RSA unit of EMC. Under the terms of the deal, VeriFone, which makes much of the equipment used to process credit card transactions, will combine its tokenization technology with similar technology from RSA. The RSA technology is also used by First Data, which announced today that it, too, is working with VeriFone. So in effect, RSA is trying to create a de facto tokenization standard.

Unfortunately, when it comes to tokenization, the PCI Council hasn't been all that definitive about how it views this approach to the problem. According to Rob Sadowski, director of marketing for RSA, as vendors and customers gain more experience with PCI DSS, they are in many ways moving beyond where the PCI Council is with the standard, something that ultimately will need to be addressed.

Speaking of standards, the folks at Aria Systems would like to point out that in the absence of those standards, the company is making available a "universal token" that any service provider can use. According to Aria Chairman Ed Sullivan, the nice thing about that approach is that it prevents customers from getting locked into any one provider.

Protegrity CTO Ulf Mattsson says that an official tokenization standard won't be coming anytime soon. But in December the council will clarify its definition of tokenization, largely because too many vendors are conflating tokenization with encryption.
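The distinction the council wants to draw is worth spelling out. The toy example below (my own illustration, using a deliberately simplistic XOR keystream rather than a real cipher) shows why the two are not the same: encrypted data is mathematically derived from the card number, so anyone who obtains the key can reverse it, while a token is random and can only be resolved by a lookup in the provider's vault.

```python
import secrets

pan = b"4111111111111111"

# Encryption: ciphertext is a reversible function of plaintext + key.
# (XOR keystream here is a toy illustration, not a real cipher.)
key = secrets.token_bytes(16)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ciphertext = xor_cipher(pan, key)
# Reversible anywhere the key travels -- steal the key, get the PAN.
assert xor_cipher(ciphertext, key) == pan

# Tokenization: the token is random, with no mathematical relationship
# to the PAN. The only way back is a lookup in the provider's vault --
# there is no key to steal from the merchant's systems.
vault = {}
token = "tok_" + secrets.token_hex(8)
vault[token] = pan
assert vault[token] == pan
```

In other words, encryption moves the risk onto key management, while tokenization moves it off the merchant's systems entirely, which is exactly why vendors blurring the two terms is a problem the council feels it has to address.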

In the meantime, Protegrity and others will compete vigorously over on-premises and cloud computing deployments of tokenization; Protegrity claims to have the fastest, most distributed tokenization architecture. At the same time, HyTrust, VMware, Cisco, Savvis and Coalfire announced today that they are working on a reference architecture for deploying PCI DSS 2.0-compliant systems on top of virtual servers, which HyTrust CTO Hemma Prafullchandra said is a deployment model now officially supported in the PCI DSS standard.

Of course, the best thing about the new specification is that it calls for a risk-based approach to credit card security, which is code for telling people they need to rank their risks and apply levels of security rationally from there. What that really means is that when it comes to PCI DSS, don't let the requirements drive you crazy.