Five Warning Signs Your Security Policy Is Lacking
Warning signs of a weak security policy from SunGard Availability Services.
In early October, the Government Accountability Office released a report showing weaknesses in the government's cyber security efforts and a 650 percent increase in incidents over the past five years. Another recent study found that small businesses believe their networks are safe, yet an overwhelming number are doing little of anything to ensure security.
Much of the news about cyber security in government and industry, then, sounds like a story of failure. But in a conversation with Paul Kocher, president and chief scientist of Cryptography Research, I learned that focusing a bit more on failure could be just what we need to improve overall security.
Kocher said that when we implement data security mechanisms, the goal is to enforce policies limiting the use of information and resources. Failures occur when policies are violated or when enforcement of the security mechanism is ineffective.
Kocher said that right now, the same computer security problems keep reappearing because the technology industry isn't doing a good job of learning from past mistakes. For example, he said, when an airplane crashes, a detailed study is done to understand the various contributing causes, and the results are disseminated to help ensure that the same problem won't recur. In most cases, the most important issues identified are not the immediate cause of the crash, but rather larger issues having to do with the design and maintenance of planes. He continued:
In contrast, when computer security failures occur, the focus is usually limited to identifying a single immediate vulnerability, while bigger-picture issues like user interface design, software architectures, network topologies, testing methodologies, and project management approaches get comparatively little attention.
To me, Kocher's approach sounds reactive rather than proactive. I would think that being reactive invites more problems. But Kocher said better understanding of why things are going wrong in computer security is essential for making progress on solving the problems. He told me:
In terms of individual vulnerabilities, reactive approaches are a necessary evil for the foreseeable future. Given the choice between leaving a widely-known vulnerability unpatched and fixing it, clearly fixing it makes sense. That said, the most important adversaries are those capable of finding and exploiting vulnerabilities, and patching does little good if other un-patched vulnerabilities remain. Ultimately, however, we need to get systems to the point where there aren't huge pools of undiscovered vulnerabilities.
Finally, I asked Kocher what IT security folks can learn from other companies' mistakes or security breakdowns. He replied:
The biggest problems lie upstream of IT departments and users. Products are riddled with bugs, patching demands conflict with business needs, and vendors are incentivized to close sales instead of being honest about security risks. To help deal with this, purchasers can structure vendor relationships to mandate open sharing about risks, have vendors share in the costs associated with failures (or receive rewards for successes), and increase payments over time to vendors who improve security. Another aspect of the problem for IT departments is how to be creative in trying to recognize if failures are occurring, since stealth is a big part of attackers' strategies. Even a very simple approach, such as placing a tempting-looking URL in a document and then watching server logs to see if the URL is visited, can be very effective.
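The canary-URL trick Kocher describes can be sketched in a few lines. The snippet below is a minimal illustration, not his implementation: it assumes a unique "bait" path (here the hypothetical `/internal/q3-acquisition-plan.pdf`) has been planted in a sensitive document, and that the web server writes Common Log Format access logs. Any request for that path suggests the document leaked and the link was followed.

```python
import re

# Hypothetical canary path planted in a document nobody should be reading.
CANARY_PATH = "/internal/q3-acquisition-plan.pdf"

def find_canary_hits(log_lines, canary_path=CANARY_PATH):
    """Return (client_ip, log_line) pairs for requests that hit the canary path."""
    # Common Log Format: host ident authuser [time] "METHOD path PROTO" status bytes
    pattern = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)')
    hits = []
    for line in log_lines:
        m = pattern.match(line)
        if m and m.group(3).startswith(canary_path):
            hits.append((m.group(1), line.rstrip()))
    return hits

# Two synthetic log lines; only the second fetches the canary URL.
sample_log = [
    '10.0.0.5 - - [10/Oct/2011:13:55:36 -0700] "GET /index.html HTTP/1.1" 200 2326',
    '203.0.113.9 - - [10/Oct/2011:13:57:01 -0700] '
    '"GET /internal/q3-acquisition-plan.pdf HTTP/1.1" 404 512',
]
for ip, _line in find_canary_hits(sample_log):
    print(f"ALERT: canary URL requested by {ip}")
```

In practice you would make the canary path unique per document (so a hit also tells you which copy leaked) and run the scan continuously against live logs rather than a static list.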