Stop for a moment and imagine what it would be like if all of the sensitive data in your company suddenly went away. It wasn't stolen; your company just found a way to operate without needing to keep that sensitive data on hand. Sounds pretty sweet, right?
For everyone in the payment lifecycle, the sensitive data our firms need to do business is like a giant albatross around our necks. We must protect it, constantly monitor who has access to it, and live in constant fear of its theft. Financial-services firms such as card issuers and acquirers have it worst of all -- we have a vested interest in making sure our merchants are protecting the data, but we often don't have direct control over whether they do.
So it's no wonder that a technology promising to make all these headaches go away is getting a lot of attention. While we're all struggling to get and stay compliant with the PCI Data Security Standard, the idea that we could install some technology that reduces the stress of protecting sensitive data has quite an appeal. And this is exactly what tokenization promises to do.
What is tokenization?
To see how tokenization works and why it's useful, it helps to compare how a typical payment transaction currently works versus the ideal of a fully tokenized scenario. When a customer goes to a company and hands off his or her card for authorization, the default scenario is that the merchant needs to keep the cardholder data on file to perform a variety of functions. For example, the merchant needs to keep a record of the account to settle transactions, process recurring payments (like at a gym), modify or update the transaction amount based on instructions from the customer (such as when a customer wants to add a tip to a restaurant bill), or issue refunds.
In this case, the cardholder data is necessary for a company to do business. But while it's necessary, it also carries a serious compliance burden: much of the PCI DSS speaks directly to the requirements related to that data storage.
By contrast, tokenization attempts to minimize the amount of data the business needs to keep on hand -- in this case, by replacing the cardholder data with a "token," a randomly generated value the merchant can use instead of the primary account number (PAN). Since the token is not a PAN and can't be used outside the context of that unique transaction with the merchant, it doesn't carry the same high level of sensitivity that a PAN does.
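To make the idea concrete, here is a minimal sketch of how a tokenization vault might work. It is illustrative only -- real service providers run hardened, audited systems, and the class and method names here are hypothetical, not any vendor's API:

```python
# Minimal illustration of tokenization: a random token stands in for the PAN.
# Hypothetical sketch -- not any vendor's actual implementation.
import secrets

class TokenVault:
    """Maps randomly generated tokens back to the PANs they replace."""

    def __init__(self):
        self._vault = {}  # token -> PAN; in practice, an encrypted store

    def tokenize(self, pan: str) -> str:
        # The token is random, so it reveals nothing about the PAN.
        # It can't be "decrypted" -- only looked up in the vault.
        token = secrets.token_hex(16)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault holder (the service provider) can map back.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"  # the merchant stores only the token
assert vault.detokenize(token) == "4111111111111111"
```

The key property is that the mapping lives entirely with the vault holder: a stolen token is useless without access to the vault itself.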
In a tokenization scenario, the organization outsources their payment processing to a service provider that provides a "tokenization option," such as Shift4 Corp., Electronic Payment Exchange, Merchant Link or Braintree Payment Solutions. The service provider handles the issuance of the token value and also handles the heavy lifting of keeping the cardholder data locked down. Alternatively, a more in-house approach might leverage a product like nuBridges Inc.'s Protect to bring the service-provider functionality on premises.
Pros and cons of tokenization
The relative benefits of a tokenization scenario should be clear to anyone who has worried about complying with the PCI DSS. Requirements like 3.4 ("Render PAN, at minimum, unreadable anywhere it is stored…") go from being an "Oh my gosh" to a "Who cares." Why? Because the token isn't a PAN. Once you make the switch, you're no longer storing PANs, so that requirement -- along with numerous others in the PCI DSS that target data storage -- ceases to apply.
From an integration standpoint, companies offering these services are heavily incentivized to keep complexity down because it enables them to sell to smaller merchants and retailers with limited in-house technical expertise. This is good news for larger organizations as well. No integration is ever truly "seamless," but since the majority of changes are on the back-end (service provider) side, changes to the merchant environment should be relatively few.
For many organizations, then, deploying a tokenization solution can be a more cost-effective way to meet PCI requirements than implementing a host of technical security controls around data storage. While there are fees associated with implementing a tokenization solution, the reduced scope of compliance and the reduced need for storage-related technical controls are likely to wind up a net gain.
But just as there's no such thing as a free lunch, there's also no panacea -- at least not in information security. In most scenarios, it's the merchant who supplies the cardholder data to the service provider in order for the tokenization to occur. This means the merchant still has a role in the transaction flow, and because the PCI DSS applies to everyone who stores, processes or transmits the data, merchants still have compliance obligations. While those obligations are certainly lighter when dealing with tokens rather than live PANs, organizations still need to comply with the requirements designed to protect data in transit, at least for the machines and processes involved in the transaction before tokenization occurs.
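The shape of that residual obligation can be sketched in code: the PAN crosses the merchant environment exactly once, at capture, and every follow-on operation (tips, recurring billing, refunds) references only the token. Everything below is hypothetical -- the provider methods (`request_token`, `charge_token`) are stand-ins, not any vendor's real interface:

```python
# Sketch of the merchant's side of a tokenized flow (hypothetical API).
import secrets

class FakeProvider:
    """Stand-in for a tokenization service provider."""
    def __init__(self):
        self._vault = {}   # the provider, not the merchant, holds PANs
        self.charges = []

    def request_token(self, pan: str) -> str:
        token = secrets.token_hex(16)
        self._vault[token] = pan
        return token

    def charge_token(self, token: str, amount_cents: int):
        # The provider maps the token back to the PAN internally.
        self.charges.append((self._vault[token], amount_cents))

def capture_sale(provider, pan: str, amount_cents: int) -> str:
    # The PAN touches the merchant environment only here, at capture,
    # so PCI DSS in-transit controls (e.g., TLS to the provider) still
    # apply to this step even in a tokenized setup.
    token = provider.request_token(pan)
    provider.charge_token(token, amount_cents)
    return token  # the merchant's records keep only the token

def add_tip(provider, token: str, tip_cents: int):
    # Follow-on operations reference the token, never the PAN.
    provider.charge_token(token, tip_cents)

provider = FakeProvider()
token = capture_sale(provider, "4111111111111111", 5000)
add_tip(provider, token, 750)
```

Note what the merchant persists after `capture_sale` returns: only the token. The in-transit exposure is confined to the single capture call, which is exactly the slice of the PCI DSS that still applies.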
About the author:
Ed Moyle is a manager with CTG's Information Security Solutions practice and a founding partner of consulting firm SecurityCurve. He is co-author of "Cryptographic Libraries for Developers" and a frequent contributor to the information security industry as an author, public speaker, and analyst.
This was first published in May 2009