If you don't have your car inspected for several years and then an equipment failure causes an accident, you, as the owner of the car, are usually held liable.
Or what if you leave your car door unlocked with the keys in the ignition, and somebody takes the car and uses it to harm people in the commission of a felony? And what if, despite repeated warnings, that scenario played out time and again?
How long would it be before there was a hue and cry for us to be more responsible about locking up our cars at night? Given how often IT systems are being compromised lately, you can't help but wonder when a similar backlash might be heading the way of IT departments.
The fact of the matter is that hackers and criminals are not going away, in real life or in cyberspace. But if an organization fails to take reasonable measures to secure things that could be used in a crime, how culpable is it?
A recent survey conducted by Tufin Technologies, a provider of firewall management software, found that of the 242 IT professionals surveyed at companies with more than 1,000 employees, 30 percent said they audited their firewall security only once every five years or more. So when those systems are ultimately compromised, and then used to harm systems elsewhere, who should be liable? Obviously, the perpetrator of the crime is primarily responsible. But shouldn't the organization that failed to take reasonable security measures carry some burden of the blame?
Obviously, it would be up to a court to determine what level of blame to assign. But even in the physical world, police departments are telling organizations to better secure their assets. For instance, the police commissioner of New York City recently told retail banks that they were making their premises too inviting to rob, which was putting the public and police officers at risk. The city of New York hasn't taken the matter any further, but banks were put on notice that they are responsible in part for their own security.
How long will it be before some enterprising litigator, acting on behalf of a company that suffered a data breach, sues a partner company that had integrated its systems with the victim's but failed to take reasonable security precautions?
Michael Hamelin, chief security architect for Tufin Technologies, says that new compliance requirements, along with increased awareness of the security risks associated with any systems or applications integration project, are driving IT organizations to take a much stronger stand on security auditing. In addition, Hamelin notes that security auditing tools have become much more automated, so there really isn't much excuse left for failing to audit IT security.
Right now, most IT security audits operate on the honor system. But the Tufin Technologies study did find that one in 10 security professionals said they knew of somebody who had cheated on a security audit. Hamelin says that's a major improvement from just a few years ago, when it was five in 10.
But those improvements notwithstanding, it's pretty clear that most IT organizations think of security in terms of their own internal requirements rather than their obligations to the business ecosystem at large. And yet we all know that an inevitable day of security reckoning, brought on by some catastrophic system-wide breach that affects hundreds of companies at the same time, is not all that far off.