It couldn't get much worse for TJX Companies. The breach of the retail giant's credit card payment systems in January was bad enough. Then TJX's Form 10-K filing with the Securities and Exchange Commission in late March revealed that a total of 46.5 million card numbers had been stolen, making it the biggest data breach ever.
A review of the 10-K -- and some reading between the lines based on media reports -- shows some glaring holes in basic security for handling credit card data, and non-compliance with industry standards. There's speculation, based on the scope and type of breach, that this could've been an inside job. Either way, stricter adherence to some fundamental principles of access management and closer auditing for standards compliance -- most notably compliance with the Payment Card Industry (PCI) Data Security Standard -- could've prevented the breach altogether. The PCI DSS, which has 12 basic requirements, isn't perfect, but it provides a simple roadmap for securing card operations. In this tip, we'll review what other organizations can learn from TJX's missteps.
Too much data, kept too long
First, TJX insecurely stored transaction records and customer information after it had served its business purpose, violating PCI DSS requirement 3, which calls for the protection of stored cardholder data. According to the 10-K, TJX still doesn't have a complete picture of what customer data was stolen. This is partly because some of the data that might have been stolen was later deleted in the normal course of business. Another reason is that techniques used by the intruders left no tracks, making it impossible to identify other data that might have been pilfered.
But the 10-K states that the data stolen in 2005, which was stored at TJX's Framingham, Mass., facility, may have represented up to half of all transactions at stores in the U.S., Puerto Rico and Canada between Dec. 31, 2002 and June 28, 2004. Requirement 3.1 of the standard clearly states data is only to be held for as long as necessary for business, legal or regulatory purposes -- which might not have been the case here.
Additionally, section 3.2.1 says that Track 2 data -- cardholders' names, primary account numbers and service codes, contained on the magnetic stripe of credit cards -- shouldn't be stored at all. The 10-K filing says that such data wasn't stored on its Framingham systems after Sept. 2, 2003, hinting there was a window before that date when it was.
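The retention rule in requirement 3.1 boils down to a simple discipline: define a business-justified retention window and purge anything older. A minimal sketch of that idea, in Python, with a hypothetical 90-day window (the actual period would be set by business, legal and regulatory needs):

```python
from datetime import datetime, timedelta

# Hypothetical retention window; the real value comes from the
# organization's documented retention policy, not from code.
RETENTION = timedelta(days=90)

def purge_expired(transactions: list, now: datetime) -> list:
    """Keep only transactions still inside the retention window.
    Everything older is dropped outright -- the point of req. 3.1 is
    that expired cardholder data shouldn't exist anywhere to steal."""
    return [t for t in transactions if now - t["timestamp"] <= RETENTION]

now = datetime(2007, 4, 1)
transactions = [
    {"id": 1, "timestamp": now - timedelta(days=10)},   # inside the window
    {"id": 2, "timestamp": now - timedelta(days=400)},  # long expired
]
kept = purge_expired(transactions, now)
print([t["id"] for t in kept])  # [1]
```

Had a scheduled job like this run against the Framingham systems, transactions from 2002-2004 would not have been sitting there to be stolen in 2005.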
Plus, encryption controls may not have been sufficient for customer data, either stored or transmitted to outside parties. PCI Data Security Standard section 3.4 cites encryption with "strong cryptography with key management processes" as one of four approaches to safely obscuring stored cardholder data; the others are one-way hashing, truncation and index tokens.
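Two of those four approaches -- truncation and one-way hashing -- can be sketched in a few lines of Python using only the standard library. This is an illustration of the concept, not a compliant implementation; the function names are ours, and in practice the key would live in a hardware security module, not in code:

```python
import hashlib
import hmac

def truncate_pan(pan: str) -> str:
    """Keep only the last four digits of the primary account number.
    A truncated PAN is useless to a thief but still lets a clerk
    confirm which card a customer used."""
    return pan[-4:].rjust(len(pan), "*")

def hash_pan(pan: str, secret_key: bytes) -> str:
    """One-way keyed hash of the full PAN. A plain unsalted hash of a
    16-digit number can be brute-forced in minutes, so key it with
    HMAC; the key itself must be stored and managed separately."""
    return hmac.new(secret_key, pan.encode(), hashlib.sha256).hexdigest()

pan = "4111111111111111"  # well-known test card number
print(truncate_pan(pan))  # ************1111
digest = hash_pan(pan, b"demo-key-kept-in-an-hsm")
```

Either approach would have left the Framingham transaction files worthless to whoever copied them.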
Even more damaging is an indication in the filing that some of the data stolen in 2006 from the Watford facility in the UK was grabbed in transit during the payment card approval process. This includes Track 2 data, none of which was encrypted during transmission. This violates section 4.1, which mandates the encryption of customer data in transit over the Internet. It also calls for using WPA or WPA2 to encrypt wireless transmissions of such data. There has been some speculation the intruders may have used a hole in TJX's wireless network.
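Protecting card data in transit is largely a matter of refusing to send it over an unwrapped connection in the first place. A minimal modern sketch using Python's standard `ssl` module (the helper names are ours; in 2005 the equivalent would have been an SSL/TLS tunnel between store and processor):

```python
import socket
import ssl

def strict_tls_context() -> ssl.SSLContext:
    """Build a client context that verifies the server certificate,
    checks the hostname, and refuses anything older than TLS 1.2."""
    context = ssl.create_default_context()  # CERT_REQUIRED + hostname check
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    return context

def send_authorization(host: str, payload: bytes, port: int = 443) -> None:
    """Wrap the TCP connection before any card data leaves the process,
    so Track 2 data never crosses the wire in the clear."""
    with socket.create_connection((host, port)) as raw:
        with strict_tls_context().wrap_socket(raw, server_hostname=host) as tls:
            tls.sendall(payload)
```

The key design point is deny-by-default at the transport layer: there is no code path that sends the payload over the raw socket, so a misconfiguration fails loudly instead of silently leaking card numbers.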
Who had the keys?
But there's a wrinkle to the encryption part of the story. From the 10-K, it appears that while TJX failed to encrypt data in transit, data at rest was encrypted. However, the company believes the intruder may have had access to its encryption keys. If that was the case, and the heist was an inside job, the issue was a lack of internal controls rather than a lack of proper encryption.
Section 3.5 of the standard clearly calls for secure storage and limited access to encryption keys, but requirement 7 also says that access to cardholder data should only be granted on a need-to-know basis and that systems should otherwise be set to "deny all." While the 10-K doesn't specifically address access management issues, the hint of an insider threat is a sign that basic data access control procedures weren't thorough enough.
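The "deny all" posture of requirement 7 is simple to express in code: access is refused unless a grant exists for that specific user and action. A minimal sketch (class and method names are illustrative):

```python
class CardDataACL:
    """Deny-by-default access list for cardholder data.
    Anyone without an explicit grant for an action is refused --
    there is no role or default that confers access implicitly."""

    def __init__(self):
        self._grants = {}  # user -> set of permitted actions

    def grant(self, user: str, action: str) -> None:
        self._grants.setdefault(user, set()).add(action)

    def is_allowed(self, user: str, action: str) -> bool:
        # "Deny all": absence of an explicit grant means no access.
        return action in self._grants.get(user, set())

acl = CardDataACL()
acl.grant("fraud_analyst", "read_masked_pan")
print(acl.is_allowed("fraud_analyst", "read_masked_pan"))  # True
print(acl.is_allowed("store_clerk", "read_masked_pan"))    # False
```

Under a scheme like this, an insider would need an explicit, auditable grant to reach encryption keys or raw card data -- exactly the trail that appears to have been missing at TJX.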
What also needs to be reviewed is the physical access granted to employees at the Framingham and Watford facilities. PCI Data Security Standard sections 9.2 and 9.3 address procedures for badges and visitor logs. Were these carefully followed? The 10-K doesn't say.
The 10-K also clearly states that the malicious access dated back to 2005 but wasn't discovered until December 2006. That means the intruders had at least a year and a half to dance through TJX's systems. Why did discovery take so long? Section 10.5 mandates secure audit trails that can't be altered, yet the 10-K repeatedly refers to "technology used by the intruder" that covered the attackers' tracks. Here again, audit logs clearly weren't up to par.
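One standard way to make an audit trail tamper-evident is hash chaining: each entry's hash covers the previous entry's hash, so altering or deleting any record breaks the chain from that point on. A minimal sketch of the idea (not a production logging system):

```python
import hashlib
import json

def append_entry(log: list, event: str) -> None:
    """Append an event whose hash covers the previous entry's hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    digest = hashlib.sha256(payload.encode()).hexdigest()
    log.append({"event": event, "prev": prev, "hash": digest})

def verify(log: list) -> bool:
    """Recompute every hash; any edit or deletion breaks the chain."""
    prev = "0" * 64
    for entry in log:
        payload = json.dumps({"event": entry["event"], "prev": prev},
                             sort_keys=True)
        if entry["prev"] != prev:
            return False
        if hashlib.sha256(payload.encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, "login: admin")
append_entry(log, "export: card_table")
print(verify(log))                  # True
log[0]["event"] = "login: nobody"   # rewrite history
print(verify(log))                  # False
```

A chain like this doesn't stop an intruder from acting, but it makes quietly erasing eighteen months of tracks detectable on the next verification pass, especially if the chain head is copied to a separate, write-once system.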
Sections 11.3 and 11.4 require regular penetration testing and the use of intrusion detection systems. Section 11.5 requires deployment of file integrity monitoring software. Was any of this in place at TJX? The 10-K doesn't mention any of these safeguards, so it's possible none were implemented.
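The core of file integrity monitoring is straightforward: record a cryptographic digest of every file in a trusted baseline, then periodically re-scan and report anything added, removed or changed. A bare-bones sketch (real products like Tripwire add scheduling, alerting and a protected baseline store):

```python
import hashlib
from pathlib import Path

def snapshot(root: str) -> dict:
    """Record a SHA-256 digest for every file under root."""
    return {str(p): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in Path(root).rglob("*") if p.is_file()}

def diff(baseline: dict, current: dict) -> dict:
    """Report files added, removed, or modified since the baseline."""
    return {
        "added":    sorted(current.keys() - baseline.keys()),
        "removed":  sorted(baseline.keys() - current.keys()),
        "modified": sorted(p for p in baseline.keys() & current.keys()
                           if baseline[p] != current[p]),
    }
```

Run nightly against payment servers, even this crude check would flag the appearance of an intruder's tools or the modification of system binaries -- the kind of change that apparently went unnoticed at TJX for over a year.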
While compliance doesn't necessarily equal security, in the case of the TJX breach there wasn't enough of either.
About the author
Joel Dubin, CISSP, is an independent computer security consultant. He is a Microsoft MVP, specializing in Web and application security, and the author of The Little Black Book of Computer Security. He has a radio show on WIIT in Chicago on computer security and runs The IT Security Blog.