The debate is over. It’s time to collaborate!
The value of tokenization is indisputable. We’re seeing, for example, that tokenization is helping a global online retailer reduce its PCI DSS audit scope by more than 90%, with commensurate cost and resource savings! And tokenization isn’t just for the big guys. Even medium-sized retailers are reducing the complexity and costs associated with PCI DSS – thanks to tokenization.
Tokenization is bringing tremendous value to organizations in other industries as well – particularly hospitality, financial services and health care. Any organization that wants to meet best practice in risk management can benefit greatly from tokenization. Not just tokenization of credit cards, but of any personally identifiable information (PII) or protected health information (PHI). And that’s everything from data included in employment records to medical files to insurance claims and so on. Can you think of any organization that doesn’t store or transmit one of these types of information? I can’t.
That’s the good news. The not-so-good news is that as the value of tokenization is recognized, the race is on to develop tokenization wannabes and even home-grown versions. Are these tokenization solutions being tested against any standards? No.
According to John Pescatore, vice president at Gartner, since standards aren’t in place for tokenization (as they are for encryption), there is nothing against which to compare a solution to ensure it’s done correctly. And Ramon Krikken, also an analyst at Gartner, had a great deal to say on this topic at the recent RSA Conference. For example, he called for a standards group similar to the PCI SSC to lead the effort.
And at the recent Electronic Transactions Association (ETA) Conference, Paul Garcia, chairman, president and CEO of Global Payments, called for the ETA to create a committee to explore tokenization standards.
So it seems we all agree that we need a tokenization standard, but we need to make sure that it is a universal standard that extends across geographic, corporate, industry and data boundaries. Yet it already appears that we’re following the pattern of many standards efforts before us: “multiple, competing standards” (yes, the oxymoron) and lots of wasted time and energy working towards a winner.
- The Accredited Standards Committee X9 has begun working on a standard to define tokenization requirements related to credit card data in the financial services industry.
- The Hospitality Technology Next Generation Payments Workgroup just issued a tokenization standard for credit card data for use within the hospitality industry. The workgroup uses the term DataProxy for a token and requires it to be MOD-10 (Luhn) compliant.
- The Payment Industry Security Standards Council’s (PCI SSC) Scoping Special Interest Group (SIG) is working on definitions and the application of tokens as they relate to the PCI Data Security Standard (DSS).
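The MOD-10 requirement above is worth unpacking: it means a token must pass the same Luhn check digit test as a real card number, so it can flow through existing payment systems that validate that digit. Here is a minimal sketch in Python of what such a check (and check-digit generation) looks like; the function names are my own, not part of any of the standards mentioned:

```python
def luhn_checksum(number: str) -> int:
    """Compute the Luhn (MOD-10) checksum of a digit string."""
    digits = [int(d) for d in number]
    # Double every second digit from the right (excluding the rightmost);
    # if doubling yields a two-digit number, subtract 9.
    for i in range(len(digits) - 2, -1, -2):
        doubled = digits[i] * 2
        digits[i] = doubled - 9 if doubled > 9 else doubled
    return sum(digits) % 10

def is_mod10_compliant(token: str) -> bool:
    """A token is MOD-10 compliant if its Luhn checksum is zero."""
    return token.isdigit() and luhn_checksum(token) == 0

def with_check_digit(body: str) -> str:
    """Append the check digit that makes the token MOD-10 compliant."""
    check = (10 - luhn_checksum(body + "0")) % 10
    return body + str(check)
```

A tokenization system following this style of rule would generate the token body (with no mathematical relationship to the real card number) and then append the computed check digit, so that `is_mod10_compliant` holds for every token it issues.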
So we’re off and running in the direction of standards, and multiple ones at that, just as history has taught us to expect. But we need to join together now to establish them, not only for credit card data, but also, looking towards the future, for other data globally, such as PII. That’s why we proposed a Tokenization Standards Organization at last month’s RSA Conference. We’re calling on all vendors in this space to collaborate (yes, competitors do collaborate!) to develop a set of global specifications on tokenization.
I encourage you to share your thoughts on how best to get a universal tokenization standard accepted around the globe. You can comment below or contact me at firstname.lastname@example.org.
For a secure, tokenized world,