I continue to get a significant amount of e-mail asking about the Sarbanes-Oxley Act, so I thought I would provide an update on our progress toward compliance. Since the last time I commented on this subject, we have come a long way.

A few months ago, I attended a meeting with representatives from networking, data centre operations, database and application engineering, Unix and Windows NT administration and other groups to discuss control objectives for each area.

We mainly used Cobit (Control Objectives for Information and Related Technology) to help identify our controls. It provides a framework, guidelines and some implementation tools to steer companies in the right direction.

We also needed to think about which systems would have to be looked at. Our company has over 500 production Unix servers and several hundred NT servers running various applications. There was no way we could test over 700 servers. Since Sarbanes-Oxley focuses on financials, we came up with a list of systems that affect our financial reporting. Those 700-plus servers dwindled to just under 100. We then categorised them by application to better manage the workload.

Once we formalised the objectives, the testing was fairly straightforward. For example, one control objective within the Oracle database area might say, "Users do not directly access the Oracle database using the application ID or a generic account." Certain parameters within the Oracle database configuration file, as well as the Unix user accounts, would have to be reviewed to determine who had access to the server and the database. Given that we have dozens of Oracle servers in our environment and 32 tests to perform, it made sense to run a script on each server that would obtain the information from configuration files.

For Oracle, most of the test results were within either the init.ora or the listener.ora file. The script took some time to develop, but in the end, we had an easily repeatable method for testing our Oracle environment.
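As a rough sketch of that approach, the snippet below pulls a couple of settings out of an init.ora file. The parameter names (audit_trail, remote_os_authent) are genuine Oracle initialisation parameters, but the file path, the sample contents and the idea of echoing them for review are illustrative stand-ins, not our actual test script:

```shell
#!/bin/sh
# Illustrative sketch only: extract audit-relevant settings from init.ora.
# In a real run, INIT_ORA would point at the instance's actual file.
INIT_ORA=/tmp/init.ora

# Create a small sample file so this sketch is runnable stand-alone.
if [ ! -f "$INIT_ORA" ]; then
  cat > "$INIT_ORA" <<'EOF'
audit_trail = db
remote_os_authent = false
EOF
fi

# Print each parameter of interest in name=value form for the test worksheet.
for param in audit_trail remote_os_authent; do
  value=$(awk -F'=' -v p="$param" '$1 ~ p {gsub(/ /, "", $2); print $2}' "$INIT_ORA")
  echo "$param=$value"
done
```

Run against each server's configuration file, a script like this yields one consistent name=value listing per host, which can then be compared against the control objective.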

For the Unix servers, a control objective might be, "User passwords must be changed every 90 days." The test for this objective would be to review the /etc/default/passwd file on every Unix server and confirm that the "MAXWEEKS" parameter, which is expressed in weeks, was set to 13 or lower, the equivalent of roughly 90 days. With over 25 control objectives for the Unix environment and dozens of servers to test, we developed another script. Tests included grabbing configuration files, checking file permissions, listing patches and installed applications, and running commands to obtain system information.
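A minimal sketch of that password-aging check might look like the following. MAXWEEKS is the real Solaris parameter in /etc/default/passwd, but the file path used here and the pass/fail reporting are illustrative assumptions:

```shell
#!/bin/sh
# Illustrative sketch only: check password expiry against the 90-day objective.
# PASSWD_DEFAULTS stands in for /etc/default/passwd on a real server.
PASSWD_DEFAULTS=/tmp/passwd_defaults

# Create a sample file so this sketch is runnable stand-alone.
[ -f "$PASSWD_DEFAULTS" ] || printf 'MAXWEEKS=13\nMINWEEKS=1\n' > "$PASSWD_DEFAULTS"

# MAXWEEKS is in weeks, so 13 weeks is roughly a 90-day maximum password age.
maxweeks=$(sed -n 's/^MAXWEEKS=//p' "$PASSWD_DEFAULTS")
if [ -n "$maxweeks" ] && [ "$maxweeks" -le 13 ]; then
  echo "PASS: MAXWEEKS=$maxweeks"
else
  echo "FAIL: MAXWEEKS=${maxweeks:-unset}"
fi
```

The same pattern, grabbing a configuration file and comparing one parameter against a threshold, repeats across most of the Unix control objectives, which is what made scripting worthwhile.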

We'll have to repeat this process every year, so it's imperative that we come up with a standardised method of testing our control objectives. Scripts are one way of ensuring that we're consistent.

Tracking our work
To keep track of our work, we developed standardised spreadsheets for each IT control area. For each control objective, we identified the implication of a particular objective not being satisfied, the testing procedure and recommendations if it failed. We also included a column to register test results.

After the testing was completed, each person responsible for an area of testing created what we called "gap sheets," which identified failed control objectives. The managers then met to go over the gap sheets and plan how to close each gap, or to determine what are termed "compensating controls."

In the test for password expiration, for example, compensating controls might be that users are forced to use SecurID two-factor authentication to access Unix servers and that the system is locked down to prevent users from directly logging in via their user accounts. Compensating controls have to be used carefully, though, since auditors could suggest that we're making excuses for not doing the work needed to make the system modifications.

Although I've mentioned only the Oracle and Unix areas, several others were identified in relation to IT security. For example, incident response, security policies, log monitoring, intrusion detection and encryption have their own control objectives. Unfortunately, because I represent the security department, I was restricted from performing the security tests, since it could be argued that I'm biased.

We have completed our testing and are working hard to retrofit our systems, procedures and policies to comply with the identified control objectives. We're finding that we can't just arbitrarily make changes to satisfy a Sarbanes-Oxley test objective, especially in a production environment that generates thousands of dollars per minute in revenue.

For example, if an objective was to ensure that only necessary applications were installed, the response might be to remove unnecessary applications. Sometimes, by removing an application, associated system libraries also get removed, which may affect other applications or general system operation. Unfortunately, we don't have a robust development environment that directly mirrors our production systems, so we can't easily test such things ahead of time.

Changing our environment to meet the control objectives will be very time-consuming, frustrating and critical for our company. In December, the official audit will take place, and we're fairly confident that if we're tested according to the current control objectives, we'll do fine.

This week's journal is written by a real security manager, "Mathias Thurman," whose name and employer have been disguised for obvious reasons. Contact him at [email protected]