How to classify security for enterprise file folders

Many organizations provide default access to network files and folders, meaning everyone has access to everything. However, the open access model does not address the complexity that comes with multiple levels of information confidentiality. In this tip, Xin Hu shares strategies on how to define access to certain files and folders without disrupting business processes.

Establishing effective enterprise file permissions means providing users access only if they need to know the information in the files and have the approval or clearance to obtain it. It is easy to implement such restrictions in certain localized file structures. However, it is much more challenging to implement them across numerous enterprise-wide file repositories. This tip provides strategies to do just that.

Problems with ad hoc file access
The ad hoc access control scenario originates from applying open access as the default file permissions. File structures are often open to all users who have been authenticated to the internal network, a broad group typically known as "Authenticated Users." This open access approach assumes that employees are trusted users and hence deserve full access to the internal network. It's a user-friendly approach because users are guaranteed to have access to all the files they need.

Yet this access model does not address the complexity in multiple levels of information confidentiality. Access to certain file structures may be restricted as needed, but adequate access restrictions across the enterprise file infrastructure cannot be guaranteed. While selective locations may be secured, an open access strategy means the majority of the file structure remains open. It's an acceptable approach for smaller companies in which most users have similar access requirements, but as these organizations grow it's likely that file access requirements will become more complicated, leading to file structures that are not adequately restricted.

A side effect of the open access approach is that home directories can inadvertently inherit permissions from large user groups. Home directories are set up in shared file structures to give each user personal storage backed by enterprise backup services. Users often assume they can store confidential information in their home directories, but colleagues who poke around the network may be able to reach those directories and access files they should not see. Relying on other users simply not finding these directories is a "security by obscurity" failure; home directories must be explicitly kept private.

How to limit access to files containing confidential information
The following controls must be implemented to achieve effective file permissions in an environment with open access as the default.

Know which users have access to which files -- There are two approaches to denying anonymous access at particular folder locations without affecting business functions. The ideal approach is for folder owners to identify the approved users and their levels of access, based on team collaboration requirements. This approach aligns with the principle of granting the appropriate users the appropriate levels of access from the outset. However, if the collaboration effort spans multiple teams across the company, this task may be too daunting.

The alternative approach is to use automated tools. Many logging tools can provide information on user access during a specified timeframe. If the log can identify the users who actually used a defined folder, they can automatically be grandfathered into new user groups that have permission to access the folder. This method can quickly reduce the access from large user groups to smaller ones. The drawback of this approach is that the grandfathered groups may contain users who do not have a need to access the folder locations: simply by browsing there, they were recorded by the logging tool as legitimate users. Therefore, after the automated tool performs the initial grouping, it is still necessary to further restrict the access to only the approved users.
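The log-driven grandfathering step can be sketched as a small script. This is a minimal illustration, not a specific product's tooling; the CSV log format, field order, and server paths are assumptions:

```python
import csv
from collections import defaultdict
from io import StringIO

# Hypothetical access-log extract: timestamp, user, folder.
# The format and field order are assumptions for illustration.
SAMPLE_LOG = """\
2008-01-15T09:12:00,alice,//fileserver/finance/reports
2008-01-15T09:30:00,bob,//fileserver/finance/reports
2008-01-16T14:02:00,alice,//fileserver/hr/reviews
"""

def grandfather_groups(log_file):
    """Map each folder to the set of users observed accessing it,
    forming the candidate membership of a new, narrower user group."""
    groups = defaultdict(set)
    for _timestamp, user, folder in csv.reader(log_file):
        groups[folder].add(user)
    return dict(groups)

groups = grandfather_groups(StringIO(SAMPLE_LOG))
```

As the tip notes, the resulting groups are only a starting point: owners still need to prune members who were merely browsing rather than working in the folder.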

Classify files so that confidential locations can be identified -- If the enterprise's file structure is too large to be given security classifications in one pass, it may be necessary to restrict access in phases based on levels of confidentiality. The first step is to perform an information classification scan to determine the classification ratings. Using a phased approach based on formal classification labels defined by the company, project managers can restrict access first to file structures containing information with the highest confidentiality rating, then to file structures with the second-highest rating, and so on. This exercise can be discontinued upon reaching the rating level of generally open information.
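The phased lockdown above can be expressed as a simple prioritization over scan results. The label names, folder paths, and scan output format below are hypothetical; the point is only the ordering logic, which works through labels from most to least sensitive and skips generally open information:

```python
# Hypothetical classification ratings produced by a scan; the labels
# and repository paths are illustrative only.
scan_results = {
    "/shares/legal/contracts": "Restricted",
    "/shares/finance/quarterly": "Confidential",
    "/shares/marketing/brochures": "Public",
    "/shares/hr/reviews": "Restricted",
}

# Company-defined label hierarchy, most sensitive first.
# "Public" is generally open information and is never locked down.
PHASES = ["Restricted", "Confidential", "Internal"]

def phased_lockdown_order(results, phases=PHASES):
    """Return folders in the order they should be restricted:
    highest confidentiality rating first."""
    order = []
    for label in phases:
        order.extend(sorted(f for f, l in results.items() if l == label))
    return order

plan = phased_lockdown_order(scan_results)
```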

For certain repositories, it may be feasible to classify information and verify the access restriction adequacy at the time that files are uploaded. Alternatively, periodic scans can be employed to detect the classification labels and the access restriction in place. File or folder owners can then be notified to rectify the situations or identify false positives.
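A periodic scan of this kind amounts to comparing each folder's classification label against the breadth of its current access. The sketch below assumes a simple folder inventory; the group names, labels, and dictionary layout are illustrative, not any product's API:

```python
# Broad groups whose presence on a confidential folder indicates
# an inadequate restriction (names are assumptions).
OPEN_GROUPS = {"Authenticated Users", "Everyone"}

def find_violations(folders):
    """folders: list of dicts with 'path', 'label' and 'groups' keys.
    Return paths of non-public folders still open to broad groups,
    for owners to rectify or identify as false positives."""
    violations = []
    for f in folders:
        if f["label"] != "Public" and OPEN_GROUPS & set(f["groups"]):
            violations.append(f["path"])
    return violations

inventory = [
    {"path": "/shares/hr", "label": "Restricted",
     "groups": ["Authenticated Users"]},
    {"path": "/shares/wiki", "label": "Public",
     "groups": ["Authenticated Users"]},
    {"path": "/shares/finance", "label": "Confidential",
     "groups": ["Finance-Team"]},
]
flagged = find_violations(inventory)
```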

Publish policies on using file repositories and record retention -- It is necessary to develop and publish a policy on which file repositories may store certain types of information, as well as a record-retention policy. For example, temporary work-in-progress files and final versions may be stored in different document repositories. How long certain records need to be retained should also determine where they are stored.

Avoid affecting business functions -- The key here is to know who needs access, and thus not to revoke legitimate users' access. If tools are used, thoroughly test them before deploying in a production environment. Generic accounts are often used by automated programs to write files into certain folder locations. If their access is denied, business processes would be affected. In financial services firms in particular, generic accounts are sometimes used only for quarter-end or year-end processing, and therefore, they are often forgotten during the classification process. Special care is needed when handling generic accounts.

Denying anonymous access as a default
The methodology of denying all but approved users has proven to be effective in many security domains, such as firewalls and ports. Many newer file storage mechanisms offer this type of access as the default setting. For example, Microsoft's SharePoint collaboration platform provides the option of granting or denying anonymous access. If a file location contains confidential information, the site administrator can deny anonymous access and grant access for approved user groups and individual users.

Denying anonymous access should not be taken to the extreme, however. Usability for access administration is still an important factor when designing lockdown methods. To avoid unmanageable complexity, access restriction should not be applied at the individual file level as a widespread practice, regardless of whether access is managed by an administrative team or by the owners themselves.

Permission sets and user groups can be used to simplify access administration. For instance, in EMC Corp.'s Documentum file repository, world-read access can be set to "none" to deny anonymous access. If a folder structure is used by a common user population, a shared permission set can be applied at those locations, so that user changes propagate through the permission sets and user groups without having to be made at each file or folder location.
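The benefit of permission sets can be shown with a minimal model. This is a generic sketch of the pattern, not Documentum's actual API; all names are illustrative. Folders reference a shared permission set, so one group-membership change takes effect at every folder that uses the set:

```python
# World/anonymous access is denied by default; access flows only
# through the permission set's group grants.
permission_sets = {
    "project-x": {"Project-X-Team": "write"},
}
group_members = {"Project-X-Team": {"alice", "bob"}}
folder_acl = {
    "/repo/project-x/docs": "project-x",
    "/repo/project-x/specs": "project-x",
}

def user_access(user, folder):
    """Resolve a user's effective access via the folder's permission set."""
    pset = permission_sets[folder_acl[folder]]
    for group, level in pset.items():
        if user in group_members.get(group, set()):
            return level
    return None  # deny by default

# A single change to the group updates access at every folder
# referencing the permission set -- no per-folder edits needed.
group_members["Project-X-Team"].add("carol")
```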

The starting point for effective enterprise file permissions is to set the default to deny anonymous access. From there, a balance between administrative convenience and the extent of lockdown should be reached within the boundaries of the enterprise information classification policy and file storage policy, without making access management burdensome on users or business processes.

About the author:
Xin Hu, CISSP, GWAS, is a senior security analyst for a major financial services company. She specializes in Web and application security, internal and external security assessments, and unstructured data security.

This was last published in March 2008
