In the financial-services industry, a hot topic of debate in the areas of security and compliance involves encryption (both for stored data and data in transit) versus tokenization. With compliance pressures such as the Payment Card Industry Data Security Standard (PCI DSS) and FFIEC information security examination requirements (which include an extensive section on encryption and data protection), organizations are looking for the best ways to protect cardholder and other sensitive data. "End-to-end" encryption and tokenization are both promising solutions, but each has definite benefits and shortcomings that should be carefully considered before making any technology investment.
Let's start with encryption. End-to-end encryption means encrypting data at rest and keeping it encrypted in transit until it reaches its final destination, where decryption occurs. When implemented properly using well-known and trusted algorithms, end-to-end encryption can provide the greatest level of data confidentiality.
For example, payment card PINs used by card processing firms are often encrypted and decrypted within a specialized hardware security module (HSM) using 3DES or other strong algorithms. These modules are frequently kept under physical lock and key and are only accessed by parties with shared administration duties. In a case like this, the likelihood of data compromise is somewhat lower. In another scenario, credit card data is encrypted at a point-of-sale (PoS) terminal using 3DES, AES or other algorithms, and isn't decrypted until it ultimately reaches the acquiring bank for processing. As another benefit, because encryption has been around longer than tokenization, encryption solutions are more likely to integrate with existing PoS, network and database systems, as well as financial applications.
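The shape of that flow can be sketched in a few lines of Python. This is only an illustration: a one-time-pad XOR stands in for AES or 3DES, and the function and variable names are hypothetical, not any product's API. The point is that the card data is sealed at one end, stays opaque at every intermediate hop, and is unsealed only at the other end, which alone holds the key.

```python
import os

def xor(key: bytes, data: bytes) -> bytes:
    """One-time-pad XOR; a toy stand-in for AES/3DES in this sketch."""
    return bytes(k ^ b for k, b in zip(key, data))

pan = b"4111111111111111"        # card number captured at the PoS terminal
key = os.urandom(len(pan))       # in practice, managed inside an HSM

ciphertext = xor(key, pan)       # encrypted at the terminal -- one "end"

# ...the ciphertext travels through processors and gateways unchanged;
# none of them hold the key, so none can read the card number...

recovered = xor(key, ciphertext) # decrypted only at the acquiring bank
assert recovered == pan
```

The middle comment is the whole point of "end-to-end": any intermediate stop that decrypts and re-encrypts reintroduces a moment where the plaintext exists.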
Unfortunately, end-to-end encryption is not simple to implement. To begin, there is often confusion about what "end-to-end" really means. If the financial data is processed at multiple stages in transit by different operating systems and applications, there may be several cycles of decryption and re-encryption, which largely defeats the purpose of end-to-end encryption, since the data is most vulnerable during these operations. In some cases, the data or a portion of it is also needed for business reasons; a common example is the retention of payment card data for recurring charges and chargebacks (disputed-payment reversals). In addition, managing centralized encryption key stores is complex and costly. In these scenarios, tokenization tends to be more practical.
Tokenization technology replaces payment card data or financial accounting records with a unique value, or token, after initial authorization or processing has taken place. The technology is being heralded by some as a solution to encryption's inherent implementation and management complexity, and tokenization solutions do tend to be more flexible and simpler to set up. With tokenization, in many cases the actual financial data is never transmitted again after the original transaction or use. The token can be stored indefinitely and used in later transactions or to retrieve the actual data, which is stored elsewhere. In most scenarios, tokenization is outsourced to processing and data-handling firms, which can reduce the operational burden of managing security.
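At its core, a tokenization service is a lookup table kept behind strong access controls. The minimal sketch below is illustrative, not any vendor's API: a real vault would be a hardened, access-controlled database, not an in-memory dict. It shows the two operations the paragraph describes: swap the card number for a random, meaningless token at authorization time, and let only the vault holder map it back later for recurring charges or chargebacks.

```python
import secrets

# Hypothetical token vault; stands in for the provider's secured database.
_vault = {}

def tokenize(pan: str) -> str:
    """Replace a primary account number with a random token."""
    token = secrets.token_hex(16)
    _vault[token] = pan          # the real PAN lives only in the vault
    return token

def detokenize(token: str) -> str:
    """Recover the original PAN; only the vault holder can do this."""
    return _vault[token]

token = tokenize("4111111111111111")
assert token != "4111111111111111"              # token carries no card data
assert detokenize(token) == "4111111111111111"  # vault round-trip
```

Because the token is random rather than derived from the card number, a merchant's stored tokens reveal nothing about the underlying account numbers if they are stolen.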
However, this outsourcing aspect can be a double-edged sword. Many larger financial organizations will undoubtedly be hesitant to outsource security management of this sort, and they may not be able to in any case due to specific policies, technology requirements that aren't compatible with tokenization, and the difficulty in locating and "tokenizing" all financial data within the environment. For some large organizations, simply encrypting entire databases or storage environments may keep financial data protected, even data that administrators don't know exists. Tokenization relies on explicit modification of the data itself to work, and removing these types of encryption controls could inadvertently lead to exposure or data compromise. For these reasons, tokenization currently seems best suited to smaller organizations with more flexible requirements and more granular control over their data -- where it's stored, how it's used, and who is managing the tokens and token processing/storage applications.
Looking ahead, the industry likely won't choose one technology over the other. Given their respective strengths and weaknesses, there is plenty of opportunity for encryption and tokenization to coexist. If in-house tokenization applications are employed, the tokenizing servers and storage areas will still need encryption for effective security. And because tokenization is unlikely to cover 100% of the applications and use cases for sensitive financial data, encryption will still be required in places. Ultimately, there's no easy solution: both technologies require management and maintenance, whether in-house or outsourced.
About the Author:
Dave Shackleford is director of risk and compliance and acting director of security assessments at Sword and Shield Enterprise Security Inc., and a certified SANS instructor. He was formerly CSO at Configuresoft Inc. and CTO at the Center for Internet Security, and has worked as a security architect, analyst, and manager for several Fortune 500 companies. In addition to these roles, he has consulted with hundreds of organizations for regulatory compliance, as well as security and network architecture and engineering.