This tip is part of the SearchFinancialSecurity.com Security School lesson, Preventing data leaks. Visit the lesson page for additional learning resources.
Although information security professionals intuitively understand that sensitive information is constantly transported throughout every organization, it is not always clear whether there is a way to manage that flow. Today data leak protection (DLP) tools are being deployed in many types of enterprises, including financial services firms, to avoid the problems that occur when data travels beyond its intended boundaries.
As is often the case with any emerging product category, there is significant industry skepticism of DLP, with plenty of questions that need answers:
- Can one effectively find leaks in such complex networks?
- How good are DLP tools at identifying sensitive information?
- What is the overhead on the front end (e.g. for classification) and on the back end (e.g. for incident response)?
Fortunately, most (if not all) DLP vendors recognize the need to "try before you buy," and will provide proof-of-concept tools to deploy in your environment. In this tip, we'll explore best practices for testing and evaluating DLP products.
Find a high-utilization network pipe where most of the activity crosses zone boundaries. This way, it's fairly easy to discern when a sensitive operation is occurring. Typically, you can use one of your main Internet connections -- hopefully you have fewer than a dozen; if you have more, pick one that carries email traffic.
Put the device on a span port or network tap so it monitors passively; that way there is no risk of performance degradation or availability problems on the production network. Then, just watch.
It is possible that there won't be much to see. But with users being users, and information wanting to be "set free," you are much more likely to see plenty of activity -- much of it legitimate. Personally identifiable information (PII), communications with clients, "boomerang" work (that comes back to you at your home PC) and sales and marketing plans are all likely to surface.
When you see the results, take a step back and remind yourself that the use of sensitive data is one of the benefits that IT provides to your organization. Then take a look at the information flow happening in your environment. Highly distributed and/or decentralized environments will have the toughest time distinguishing the appropriate from the inappropriate.
Throughout the DLP product-testing process, keep the following points in mind:
- Get real. Determine the extent of your situation. While unlikely, you may find yourself underwhelmed by the nature of the information the DLP tool provides and overwhelmed by the workload that comes with implementing and managing it. In that case, a DLP tool may not be a good fit for your organization.
- Get real. Again. Enterprise data leakage problems are often considered unsolvable because IT infrastructures are too complex. This reality check demonstrates the capability of a product to actually address the problem.
- Kick the tires. Test the performance of your DLP tool to see if it can handle your environment's throughput. Remember, it's common for a large organization to have millions of DLP "fingerprints" and data detectors (i.e. techniques used to find known Social Security and credit card numbers). Compare that with the few thousand signatures used by a typical intrusion detection system (IDS), and the performance issues IDS products have been known for, and there's no question performance should be evaluated carefully. Be aware of the techniques used to meet performance requirements. More often than not, success will be found with strategic use of filters to identify only the most suspicious traffic.
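To make the idea of a "data detector" concrete, here is a minimal sketch of one: a regular-expression pre-filter for candidate credit card numbers, confirmed with the Luhn checksum to cut down false positives. This is an illustration, not any vendor's implementation; real DLP engines combine many such detectors with content fingerprints and run them at wire speed.

```python
import re

# Candidate card numbers: 13-16 digits, optionally separated
# by spaces or hyphens (e.g. "4111 1111 1111 1111").
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text: str) -> list[str]:
    """Scan text for card-like sequences that pass the Luhn check."""
    hits = []
    for match in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_ok(digits):
            hits.append(digits)
    return hits
```

The two-stage design (cheap pattern match first, expensive validation second) is the same filtering strategy mentioned above: spend the heavy analysis only on the most suspicious traffic.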
- Know the flow. DLP tools provide a unique opportunity to understand how information is used throughout an organization. How many companies utilize a network-oriented data flow diagram? Not the ones that developers use, but one that maps how content flows throughout your organization -- what the high-use applications are, who the users are, and where the hot spots are. A DLP tool should illuminate these things (albeit in somewhat rudimentary fashion). Knowing how data flows throughout the organization is the essence of risk management. To understand the data-usage patterns is to be able to do a better job as a risk manager.
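A rudimentary version of that flow map can be built from connection logs you likely already collect. The sketch below is purely illustrative -- the zone assignments and flow records are invented -- but it shows the kind of zone-to-zone summary a DLP tool produces automatically:

```python
from collections import Counter

# Hypothetical mapping of address prefixes to organizational zones.
ZONES = {
    "10.1.": "engineering",
    "10.2.": "finance",
    "203.0.113.": "internet",
}

def zone_of(ip: str) -> str:
    """Map an IP address to its zone, or 'unknown'."""
    for prefix, zone in ZONES.items():
        if ip.startswith(prefix):
            return zone
    return "unknown"

def flow_map(flows):
    """Count bytes per (source zone, destination zone) pair."""
    totals = Counter()
    for src, dst, nbytes in flows:
        totals[(zone_of(src), zone_of(dst))] += nbytes
    return totals

# Invented flow records: (source IP, destination IP, bytes).
flows = [
    ("10.2.0.5", "203.0.113.9", 48_000),   # finance -> internet
    ("10.1.0.7", "10.2.0.5", 1_200),       # engineering -> finance
    ("10.2.0.8", "203.0.113.9", 96_000),   # finance -> internet
]
```

The busiest zone pairs in the resulting map are the "hot spots" worth investigating first.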
There are no huge technical or architectural hurdles to overcome with DLP. In general, the tools are passive; just plug them in to your tap or span port. The products themselves are maturing quickly; at this stage, it is simple to identify PII and credit card numbers, as well as universal "acceptable use" issues. The more sensitive, enterprise-specific content will take some tuning. From a risk perspective, it is beneficial for organizations to know how data flows throughout the enterprise so proper protective measures can be put in place.
About the author:
Pete Lindstrom is senior analyst with Midvale, Utah-based research firm Burton Group. His areas of expertise include security metrics, risk management, Web 2.0 and SOA security and safeguards for other emerging technologies. Previously he helmed his own research group, Spire Security, and also worked as an auditor and security architect.
This was first published in April 2008