Wizard's Guide to Security by Carole Fennelly
SunWorld, February 1999

Audits from hell
Find out how to avoid those audit nightmares

------------------------------------------------------------------------

Abstract

Audit. The very word produces groans from system administrators at any company you care to name. Quite often, an audit is more intrusive and consumes more resources than a hostile break-in. This month, Carole examines some audits she's participated in to learn from how they went wrong -- or right. (2,800 words)

------------------------------------------------------------------------

The typical corporate audit is controlled by a special audit department that either performs the audit or (more likely) contracts a "big company" (one of the Big Five accounting firms) to do it. Quite often, the big company subcontracts the work to a small company with greater technical expertise. The small company would be unlikely to get the contract directly because the client company wants some form of certification. Usually, the closest it can come to certification is to state that "Big company says we're secure." An interesting article by Tan of The L0pht proposes a method for certification based on the Underwriters Laboratories standard. (See Resources.)

An audit often becomes a political contest between the audit department and the information technology (IT) department, and ensuring a secure architecture becomes almost secondary. Generally, about a month is spent attending meetings and writing memos to explain, clarify, or refute the audit findings. Sometimes, security improvements are actually implemented. Please note that such an unproductive scenario is not always the case -- nor should it be. I hope the real-world experiences I offer here will help prevent hellish audits in the future.

Ugly audits

Each of the following events actually happened at some point during the 15 years I've been in this business. There have been many others, but these examples demonstrate some major points.

Case #1: Auditor with a tool, but no clue

An internal audit department performed its own audits, believing it could save money by having its people certified and buying an auditing tool. I think the auditor in this case had received mainframe certification about 20 years back. The department on the receiving end wasn't very interested in security, much as I tried to change that. When the audit was announced, I was asked to cover the exposures.

The auditor had a tool he needed installed and run on all the servers. The tool turned out to be a commercial version of COPS, and it flagged insignificant items with the same weight as major problems. I hoped to use this as an opportunity to implement better security practices without getting my department in too much trouble. The auditor and I spent one minute reviewing the "root can log in from anywhere" exposure and two hours on the user directory permission problems. Some items flagged as exposures really depended on policy; when I asked to see the policy, I was told that, as a consultant, I could not.

I spent two hours every day for three weeks trying to teach the auditor Unix security. When he announced that he wanted to run the tool on the production systems with me, I stalled. Since the systems were downtown, I said I would have to load the software over the network, which could slow down production, so we would have to schedule the run out of hours.
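As an aside, exposures like the two above are easy enough to check by hand. The following is a minimal sketch for a Solaris box; the paths are assumptions that vary from site to site, so treat it as an illustration rather than a recipe.

    # Is remote root login restricted to the console?  Solaris keeps this
    # setting in /etc/default/login; if the CONSOLE line is commented out,
    # root can log in from anywhere.
    grep '^CONSOLE=' /etc/default/login

    # Which home directories are group- or world-writable?  /export/home
    # is just the conventional Solaris location -- adjust for your site.
    ls -ld /export/home/* | awk 'substr($1,6,1)=="w" || substr($1,9,1)=="w"'

A few lines of shell like these won't replace an audit tool, but they cover the exposures that matter without burying you in trivia.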
I went back to my desk, loaded and installed the software on the production systems, and fixed most of the problems. When we met to do the run "together," he never noticed that I did a tar -tvf, or that the software was already installed.

The upshot is that neither one of us really improved security, despite my minor corrections. Having to explain every item the audit tool reported on was so time consuming that I quietly fixed problems and reran the report. If I had left it alone, upper management might have dictated the security overhaul I originally wanted.

Case #2: Unauthorized audit

An organization within a company brought in a small company to perform an audit of its Web server. The corporate audit department wasn't involved. The outside company produced a report that extolled the wonderful security of the Web server it was contracted to audit. (The audit department later found this to be less than true.) In an apparent excess of zeal, the outside company then decided to audit the Web server of another organization within the company. Here's what happened:

* The organization that paid for the audit saw this as an opportunity to play politics and seize control of the other Web server.

* The target organization, which had already acknowledged the need to improve security, had to refute the charges and retain control of its Web server.

* The audit department had to establish that it was the only organization authorized to perform interdepartmental audits.

My job was to shoot down the report. It wasn't hard. Despite the outside company's pledge to provide straight answers, the report was full of smug innuendo and few verifiable facts, and it was clearly biased to make the paying organization look good. While there were indeed security problems on the system in question, the report's pathnames were incorrect and several of its statements assumed a SunOS operating system rather than Solaris, so they were easy to show to be false. The most damning statement was the auditor's claim to have downloaded software onto the corporate Web server.

The company that performed the audit wasn't local and wouldn't make a special trip to discuss the report or provide detailed documentation. Because they claimed to have downloaded software onto the system, and because the audit department had no contract with them, a decision was made to reload the system from scratch. But this procedure would cause about two days of downtime for the corporate Web server and would attract the attention of the CEO. So, in order to keep the whole mess quiet, the organization that contracted for the audit agreed to buy a new system for the attacked organization. This way the old system stayed up until the new one was ready, limiting the amount of downtime.

Case #3: Uncontrolled audit

Believe it or not, I actually look at log files. Automatic intrusion detection may be preferable, but in this instance we didn't yet have a system in place. Besides, looking at log files can fill the time spent waiting for people. One day, I was looking at the tcp wrappers log file while waiting for the router to come back up. I noticed about six attempts to get in from one site. I left messages for the originating site's contact and for my manager. No big deal.

The next day, while waiting for some people, I decided to check the logs again. We were actively being hit about six times per second. I told everyone to drop what they were doing because we had an active incident.
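As an aside, confirming this kind of pattern doesn't take anything fancy. Assuming tcp wrappers is logging through syslog to /var/log/syslog -- both the path and the exact message wording are assumptions that differ from site to site -- a rough count of refused connections per source looks something like this:

    # Count tcp wrappers "refused connect" entries by originating host.
    # Adjust the log path to match your syslog.conf, and the search string
    # to match whether connections are being refused or merely logged.
    grep 'refused connect from' /var/log/syslog |
        awk '{print $NF}' | sort | uniq -c | sort -rn | head

A per-source count like that, checked a few minutes apart, is usually enough to tell a stray probe from a sustained scan.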
The VP said he would call the internal audit team to find out whether it was them. We tried to contact the originating ISP, with no luck. The logs showed that every one of our external systems was being actively attacked. We called other organizations, to no avail. There was no response from the audit department, and the ISP didn't return our calls. I logged everything with the appropriate times and had my partner initial the entries. The senior VP walked in and said he had spoken to the general auditor, who had no knowledge of an audit.

With no word from the ISP or the audit department, we considered the ISP an accomplice in an attack. We contacted its upstream provider, who was very responsive. We provided logs and prepared to have the ISP shut down. Our management authorized us to contact the FBI to report an intrusion. We finally reached the contact at the originating ISP and informed him that we planned to report him to the FBI unless he had an explanation. He called us back in a few minutes and gave us the home number of a person at a well-known big-company auditor. After seven hours of about 20 people dropping everything else, the incident was finally acknowledged to be an audit. The follow-up meeting required earplugs.

Bottom line: A lot of time was wasted by a lot of people -- some of whom were working on production problems during primetime. The audit department wasn't prepared for detection and didn't inform its upper management of the test. Also, the tool that was run caused a massive amount of network traffic during production hours.

------------------------------------------------------------------------

The good audits

Yes, there actually were a few! The two examples I give here had one thing in common: both the sysadmins and the auditors were truly interested in security and had an appreciation of each other's work. In the first situation, I was the auditor. In the second, I was the audited.

As the auditor

I don't really like doing audits. Having spent too many years as a system administrator, I have a healthy respect for the demands on sysadmins' time. Therefore, if I do an audit, I like to show every way in which the administrator is doing a good job. In this case, I was surprised to be brought to a workstation and logged in as root. The VP shrugged and said that he didn't want to waste time and assumed I could get into a system with physical access. He wanted to know if I could get into the trusted servers. I actually felt bad when I showed him I could. Rather than try to backpedal, he acknowledged the problem and outlined a plan of action to correct it. I still wrote up the problem in my report, but I was able to state that it had already been addressed.

As the audited

In my experience, big-company audits are a pain and aren't to be taken very seriously. Therefore, I was caught off guard when a big company brought in a known hacker to perform the technical analysis on one of my systems. Realizing the system wasn't as hardened as it could have been, I was a bit wary. The hacker clearly expected an argument. We sat down over lunch and talked about security in general. When we went back to the lab, we both reviewed the system and came up with areas that could be improved. I actually enjoyed this audit. Unfortunately, we never received a written report, so management didn't count the audit. However, it enabled me to improve the security of the system and taught me to consider "impossible" scenarios. Therefore, I consider it a successful audit.
Audit recommendations

Here are a few important points to keep in mind when dealing with external auditors, reports, audit departments, and system administrators.

The external auditors

Background checks

It doesn't matter whether or not the auditors are known hackers. They could just as easily be corporate spies hired out to perform audits. The company running the audit should provide background documentation on every person involved in the audit. It should also provide the audited department with a signed nondisclosure agreement from each person.

Qualification checks

Don't take the company's word for it. If possible, have one of the system administrators interview the auditors to make sure they understand the architecture. Many companies will object to this, stating that it interferes with keeping the audit a secret. I don't see the value in surprise audits.

Audit parameters

A contract should detail exactly which address spaces and/or telephone exchanges are to be probed. It should also specify the time of day and duration of the audit. The contract should set realistic expectations for results, distinguishing hard exposures from theoretical ones.

Authorization and indemnification

The auditors must have proper written authorization to perform the audit. This protects the auditors from being shut down by their ISP if the audit is detected.

The report

* Accuracy counts! If you list a path for a program, make sure it is correct. Show as much actual screen data as possible. If there is a vulnerability, detail how it could be exploited.

* Document every test. Even if the system was protected, show that an exploit was tried and failed. Give the system administrators credit when they've done a good job!

* Don't play politics. The report should be used as a tool to improve the security of the systems. It should not be used as an opportunity to make an organization look bad or to create a revenue stream.

* Make sure the report reflects the company's policy.

* Blind audits consume more time and can produce inaccurate reports. I once reviewed a report that stated that "rootkit" was installed on a Cisco router.

The audit department

Scheduling

Audit departments often schedule surprise audits. This can backfire if the audit impacts a production schedule. Running scans against a live trading floor is generally considered a bad idea.

Control

During an audit (especially a secret one), the audit department must be available 24 hours a day. The entire audit department must know that an audit is in progress and be prepared for detection.

Tools

Understand the impact of every tool that will be used during the audit. Demonstrate these tools on a test network to see how "noisy" they are. Make sure the tools will not impact production.

The system administrators

Cooperate

If external auditors are used, acknowledge that they will probably have more credibility than you. Use this as an opportunity to implement security measures that you may have wanted all along. Keep an open mind. Don't consider the audit a personal attack; use it as a chance to learn. If security needs to be improved, acknowledge it and show plans for improvement. You may even get the budget for it.

Don't strike back

Whether or not you know it's an audit, resist the temptation to attack back. Follow an appropriate incident response procedure.

Documentation

Keep a log of everything -- phone calls, logs checked, etc. -- and make sure you log times. One low-tech way to do this is sketched below.
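Something as simple as a Bourne shell function that stamps each note with the time will do. This is only a sketch, and the log location ($HOME/incident.log) is an arbitrary choice for illustration, not a recommendation of any particular path.

    # Append a timestamped note to a running incident log.
    # The path is arbitrary; pick one the attacker cannot read or modify.
    note() {
        echo "`date '+%Y-%m-%d %H:%M:%S'` $*" >> $HOME/incident.log
    }

    # Usage during an incident:
    #   note "left voicemail for originating ISP contact"
    #   note "tcpd logs still show ~6 connections/sec from same netblock"

It isn't elegant, but when you're reconstructing events for the audit department (or the FBI) days later, exact times attached to every entry matter far more than elegance.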
Cover your assets

In these days of "hacking," the more mundane aspects of asset protection seem to have fallen by the wayside. A company can suffer just as much, if not more, if it's found to be running software without proper licensing. Many companies fail to consider that "freeware" may be for noncommercial use only. Source control is another boring but important factor in asset protection. Also, it's crucial that all necessary software can either be reloaded from distribution media or recompiled from source. Don't neglect the traditional aspects of asset protection for the trendy ones.

Followup to last month

There have been several e-mail messages in response to last month's column regarding padded cells built with the chroot system call. Some readers made the valid point that chroot is by no means infallible: if someone manages to become root inside the chrooted environment, they can break out of the cell. This does not, however, render chroot pointless. By limiting the number of programs placed in the padded cell and carefully screening each of them, you reduce the probability that a program in the cell can be exploited. The administrator simply must be very careful about what goes into the cell. Thanks to Simon Burr of Uunet in the UK for bringing up some very interesting points.

Bug of the month

The most notable alert this month has been the Trojan horse version of TCP Wrappers (http://www.cert.org/advisories/). It demonstrates the importance of verifying the source of any software you install.

In the news: Frontier justice

This month, divergent groups are promoting the use of vigilante tactics to achieve their goals. According to a CNN article, corporations are employing "strike back" tactics to discourage hackers. (See Resources below.) According to the article, one company, with management approval, traced an attack to its physical location, broke in, and stole the attacker's computers. And according to Hacker News Network, a group claiming to be hackers in the Legion of the Underground declared "cyberwar" on the computer infrastructure of China and Iraq.

In both cases, it's difficult to verify the stories. I've never seen a company approve "strike back" tactics; many even have policies against such behavior. The Legion of the Underground (faced with a remarkable lack of support from the hacker community) stated that its declaration of war was blown out of proportion by the media. Both stories demonstrate a dangerous mindset of revenge at any cost. The hacker community was quick to condemn this mindset; there has been no conclusive response from the corporate community regarding corporate vigilantes. Meanwhile, an Irish ISP, Connect Ireland (www.connect.ie), claims to have been targeted for cyberwarfare by the government of Indonesia for hosting the top-level domain of East Timor. Connect Ireland was forced to shut down for two days.

Disclaimer: The information and software in this article are provided as-is and should be used with caution. Each environment is unique, and the reader is cautioned to investigate with his or her company the feasibility of using the information and software in this article. No warranties, implied or actual, are granted for any use of the information and software in this article, and neither the author nor the publisher is responsible for any damages, either consequential or incidental, arising from use of the information and software contained herein.
------------------------------------------------------------------------

Resources

Resources mentioned in this story

* Tan's Cyber-UL paper
  http://www.l0pht.com/cyberul.html
* CNN cyber-vigilante article
  http://www.cnn.com/TECH/computing/9901/12/cybervigilantes.idg/index.html
* Legion of the Underground statement
  http://www.hackernews.com/archive.html?011099.html
* Hacker joint statement
  http://www.hackernews.com/archive/louwar/jointstat.html
* LoU retraction
  http://www.wired.com/news/news/technology/story/17273.html

Audit tools

* SANS Institute
  http://www.sans.org/NSA/sectools.htm
* Tool recently released by Dr. Mudge of the L0pht
  http://www.l0pht.com/advisories/tmp-advisory.txt

Other resources

* "The Dark Side of White Hat Hacking: Being 'Owned' by White Hat Hackers" -- an interesting article presenting one perception of corporate audits. Fortunately, real life doesn't always follow this course, but it makes some valid points.
  http://www.hackernews.com/orig/whitehat.html
* SANS Intrusion Detection Briefing with Stephen Northcutt
  http://www.sans.org/hackers2/what_h~1/index.htm
* "Creating a basic padded cell," January 1999 Wizard's Guide to Security column
  http://www.sunworld.com/swol-01-1999/swol-01-security.html
* Full listing of previous Security columns in SunWorld
  http://www.sunworld.com/common/swol-backissues-columns.html#security
* Peter Galvin's Solaris Security FAQ (recently updated)
  http://www.sunworld.com/sunworldonline/common/security-faq.html
* Peter Galvin's Unix Secure Programming FAQ
  http://www.sunworld.com/swol-08-1998/swol-08-security.html
* SunWorld site index -- topical listing of our most popular stories
  http://www.sunworld.com/common/swol-siteindex.html
* sunWHERE -- launchpad to hundreds of online resources for Sun users
  http://www.sunworld.com/sunworldonline/sunwhere.html
* SunWorld's back issues
  http://www.sunworld.com/common/swol-backissues.html

------------------------------------------------------------------------

About the author

Carole Fennelly is a partner in Wizard's Keys Corporation, a company specializing in computer security consulting. She has been a Unix system administrator for more than 15 years on various platforms and has particularly focused on Sendmail configurations of late. Carole provides security consultation to several financial institutions in the New York City area. Reach Carole at carole.fennelly@sunworld.com.