Auditing of IT systems is a fundamental part of verifying security measures: it ensures that what is expected to be implemented actually is. Like all forms of audit, it measures the actual implementation against a set of standards; it is a subjective assessment of objective requirements. None of this will be new to anyone, and it is an accepted view of what it means to audit.
Having spent years working in security, I am always unnerved by the variety of interpretations applied to any supposed standard. The word "standard", by definition, implies a fixed and known quantity against which things are measured. Yet when it comes to auditing against, say, ISO 27001, the subjectivity that can be brought to bear is worryingly varied. It makes sound business sense to prepare for an audit as thoroughly as possible, regardless of its nature, yet there is rarely a guarantee of success, no matter how much you prepare. In one instance, a company spent several months preparing for its audit, even employing an auditor qualified in the desired standard to act as a consultant. Still, when the audit was performed by a different auditor, it took several more months to correct the areas of non-compliance cited. This anecdote is not intended to slight either of the qualified auditors, but to illustrate the diversity of opinion as to what constitutes a successful implementation of a given criterion.
The United Kingdom's Ministry of Defence (MOD) has, in recent years, continued to bemuse many of us with a catalog of information security blunders, each followed by either a knee-jerk response or a lengthy study. Yet here is an organization stifled by self-imposed regulatory audits: IiP, Health and Safety, IT accreditation, as well as the external departmental auditing of the National Audit Office. Given all this apparent attention to detail, why do things continue to go wrong? The answer is that the size of the ministry and its armed forces means that oversight is not provided uniformly by a single organization or individual. Instead, separate groups provide a variety of audit services against a variety of standards, which leads to huge variations in subjective interpretation. The net result is widely variable application of each standard, as well as variation in whether a particular standard is applied at all. In effect, this single government department cannot guarantee that its own information will be uniformly looked after in line with any given set of rules.
Take the measures put into place following the loss of an MOD laptop containing the personal details of thousands of military recruits. The net effect froze the movement of all laptops until they had been encrypted, a process that took months because of financial constraints. Why weren't they encrypted in the first place, I hear you cry? Because many of the laptops were not required to be encrypted by the policy in force at the time, or, if they were, it had been decided to accept the risk of their being lost or stolen. This illustrates once again that subjectivity plays a key role in undermining certain aspects of security policy. The issue is that whilst an auditor can declare that a given process, or the implementation of a particular technology, meets the requirements of the standard against which the audit is being conducted, it is difficult to foresee all possible circumstances in which that particular requirement might need a more robust implementation.

For example, suppose a requirement exists for a control on the management of removable media. The implemented control is a media register in which all media are recorded and any removals from the department are also noted. The auditor accepts this, and so, on the face of it, the requirement is achieved. However, suppose that what has not been taken into account is what actually constitutes removable media within the organization, or what constitutes a removal from the department (if the department has multiple buildings, for example). Despite a successful outcome during the audit, the implemented control is incomplete, and it would be down to the auditor to notice this in his or her assessment. Different auditors, using their own judgment and skill, might either miss the full circumstances or even consider them irrelevant.
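The removable-media example can be made concrete. The following is a minimal, hypothetical sketch (all media types and building names are invented for illustration) showing how explicitly defining "removable media" and "removal" turns the register into an objective control, rather than leaving those two gaps to an individual auditor's judgment:

```python
from dataclasses import dataclass, field

# Hypothetical definitions: a media register is only a complete control if
# the organization also states what counts as "removable media" and what
# counts as a "removal" -- the two gaps described above.
REMOVABLE_MEDIA_TYPES = {"usb_stick", "cd", "dvd", "external_hdd", "backup_tape"}
DEPARTMENT_BUILDINGS = {"building_a", "building_b"}  # multi-building department


@dataclass
class MediaRegister:
    entries: list = field(default_factory=list)

    def record(self, media_id: str, media_type: str, location: str) -> None:
        """Register a media item; undefined media types are rejected outright."""
        if media_type not in REMOVABLE_MEDIA_TYPES:
            raise ValueError(f"undefined media type: {media_type}")
        self.entries.append({"id": media_id, "type": media_type,
                             "location": location, "removed": False})

    def move(self, media_id: str, destination: str) -> bool:
        """Note a movement. Transfer between the department's own buildings
        is not a removal; anything leaving DEPARTMENT_BUILDINGS is."""
        for entry in self.entries:
            if entry["id"] == media_id:
                entry["location"] = destination
                entry["removed"] = destination not in DEPARTMENT_BUILDINGS
                return entry["removed"]
        raise KeyError(media_id)
```

With the definitions written down, "was this a removal?" stops being a matter of interpretation: moving a USB stick between the department's own buildings is recorded but not flagged, while anything leaving the defined set of buildings is.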
Incidentally, is your laptop encrypted, given that you probably hold sensitive client data on it? If the answer is yes, then you and I are probably in a minority.
The apparent failure in the MOD case was in fundamental policy enforcement: neglecting to educate staff in the basics of information and media security, and the lack of workable processes within the business to limit the impact of a loss. In other words, all the right elements were present to demonstrate compliance against a standard, but those elements lacked cohesion and support across the whole department.
Of course, concerns regarding data security go deeper than loss in transit. Information is also at risk within the corporate infrastructure, an area where far more audit time is focused, not only in relation to formal standards but also on the correctness of a system and its security barriers. The same issue still presents itself: a vulnerability audit produces a wide array of results and recommendations depending on the individual auditor. There are cases of a vulnerability analysis report making recommendations against items rated high risk or critical, only for a second audit six months later to report that some of the earlier problems were still present but were now considered trivial.
Whilst it is arguably impossible to remove the subjective element of an audit without rendering the whole process inflexible or unworkable, there is a need to recognize that certain elements of IT security are simply the foundation stones of more complex measures. There should be no need to assess these foundations in any greater detail than a check that they exist. With the right amount of detail as to what constitutes a particular base-level requirement, the audit becomes a simple yes-or-no answer to the question of whether it is present. Elements such as a basic policy on e-mail use, web browsing, and user accountability are ripe for a template that fits any organization operating IT systems within the bounds of the law. Not only would this provide a simple means of demonstrating a degree of security practice, it would also engender a more objective approach: does the policy exist in the organization, and does it contain the elements X, Y, and Z? It either does or it doesn't. Such an approach does not take into account the complexities of some organizations, or the context in which security is applied, but arguably the fundamental building blocks of security are neither complex nor contextual. Do you lock your car whenever you leave it, or only if it is on a public road after 6 p.m.? Assuming that your answer is "whenever you leave it," locking your car is neither complex nor contextual.
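The yes-or-no baseline audit described here can be sketched in a few lines. This is purely illustrative (the element names are invented stand-ins for the X, Y, and Z of the text): each foundation control is a simple presence check, leaving nothing to subjective interpretation.

```python
# Illustrative baseline elements -- the template of fundamentals any
# organization operating IT systems could be checked against.
BASELINE_ELEMENTS = {"email_use", "web_browsing", "user_accountability"}


def audit_baseline(policy: dict) -> dict:
    """Return a pass/fail verdict per required element: present or not.

    There is no scoring and no judgment call -- each element either
    exists in the policy document or it does not.
    """
    results = {element: element in policy for element in BASELINE_ELEMENTS}
    results["compliant"] = all(results[element] for element in BASELINE_ELEMENTS)
    return results
```

Two different auditors running this check against the same policy cannot reach different conclusions, which is exactly the property the article argues the foundations should have.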
This is not advocacy for a complete abandonment of contextual auditing, but rather a plea for objectivity. Many organizations are unable, through size or financial position, to expend resources on auditing against a recognized standard, partly because of its complexities but also because there is no guarantee of even moderate success. This inability, coupled with their uncertainty, leads many to forgo even the fundamentals of security. Worse still are those who achieve the basics but are unable to demonstrate their success; without that demonstrable compliance, they cannot attract business from clients who demand certain levels of security.
The media industry, with its move toward greater online access to music, video, and computer games, has experienced many incidents of data loss, often in the form of piracy of material ahead of official release. So much so that there is a real drive toward securing pre-release material using techniques that would not be out of place in a le Carré novel, backed by the threat of legal action. Online content, by its very nature, needs to be accessible; yet once it is accessible to the world at large, it is, for all security purposes, compromised. Once the latest blockbuster film or hit album is released, it becomes fair game for the media pirates, and most of the media industry takes the view that the critical element is to be the first to make material available. It is the early weeks of public access that generate the most interest; if the pirates are first in the marketplace, it has a marked effect on sales revenue when the genuine article arrives later. Strategies range from non-disclosure agreements with reviewers and technical sweeps of preview events to searches of employees at CD-pressing plants and releasing pre-release material under a pseudonym. But because of the complexity of achieving an audited standard, each pre-release undergoes a repeat of these measures, with increasing security demands on outsourcing subcontractors: CD pressing, cover artwork, digital mastering studios; the list goes on.
Industry as a whole, not just the security industry, needs a means of mandating simple, manageable, and scalable practices that can be passed between companies and organizations, enabling all parties to understand each other's security position. This means a system that affords verification between those parties without the need for protracted contractual negotiation or intrusive validation by the contracting company, or, worse still, wholesale reliance on trust. Trust is the absence of a control measure!
Such a system could implement the fundamentals of information security across a whole enterprise, in staged implementations, allowing organizations of all sizes to grow their security incrementally in line with their business needs rather than trying to meet all the requirements of something like ISO 27001 at once. Companies clearly understand the value of their information and the impact a loss will have on their business, which means they should also understand the level of investment required beyond the basic (and objective) measures all businesses should have. Information security should not be left to chance, but many organizations fail to implement it for the simplest of reasons. First, they do not recognize the need for protection. Second, they see little value in an investment that no one else can see, and that would demonstrate they are taking security seriously. And since that demonstration often hinges on the subjective opinion of an auditor, the option is sometimes to spend the money on highly visible security measures that achieve very little in the way of practical security, hence the oft-heard "I know we're safe, we have a firewall."
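The staged model proposed here can be sketched as a ladder of control sets. This is a hypothetical illustration (the stage names and control names are all invented): an organization sits at the highest consecutive stage whose controls it fully meets, and grows stage by stage rather than facing an entire standard in one go.

```python
# Invented stages for illustration: each stage is a set of controls that
# must all be in place, building on the stages before it.
STAGES = [
    ("stage_1_basics",    {"email_use", "web_browsing", "user_accountability"}),
    ("stage_2_data",      {"media_register", "laptop_encryption"}),
    ("stage_3_assurance", {"incident_process", "supplier_checks"}),
]


def current_stage(controls_in_place: set) -> str:
    """Return the highest consecutive stage whose controls are all present."""
    achieved = "none"
    for name, required in STAGES:
        if required <= controls_in_place:  # subset check: all controls present
            achieved = name
        else:
            break  # stages are cumulative; a gap stops progression
    return achieved
```

Because each stage is an objective presence check, two parties can exchange and verify a stage claim without protracted negotiation or intrusive validation, which is the property the preceding paragraphs call for.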
(Certified Digital Security is exhibiting at Infosecurity Europe 2010, the No. 1 industry event in Europe held on 27th-29th April in its new venue Earl's Court, London. The event provides an unrivalled free education programme, exhibitors showcasing new and emerging technologies and offering practical and professional expertise. For further information please visit www.infosec.co.uk)