I’ve been thinking back on my conversation with a cybersecurity pro named Stuart, which I covered earlier in “The Frightening State of Unseen Security Breaches,” and his approach of not just protecting the file and email servers but wrapping everything with monitoring. The one thing I’ve seen kill companies over and over again -- the thing he was addressing -- is the assumption that everything you aren’t looking at is OK. It actually cost me a job once.
We can actually see assumptions working against the presidential candidates as I write this. Someone in Hillary Clinton’s camp evidently thought a way around a disclosure demand was to use their husband’s computer, and Donald Trump seems to assume that what is said “off the record” is off the record. Had either of these assumptions not been made, the race for the White House would be very different at the moment.
What made Stuart’s approach with Varonis unique is that he wasn’t assuming anything; he created a solution that was comprehensive enough that he never has to. And I think there is an important lesson here that I’ve learned a number of times.
I used to run an internal audit team that was pretty unusual then (and even now), because we were staffed with a rotating membership of highly qualified practitioners and had a unique mission. We were not only required to find exposures, but to train our guest auditors in what to look for and why addressing those exposures was important. We had the power to fire people but, if we did, we had to backfill them. Our group was the brainchild of our then CFO, who had discovered the concept of tiger teams and felt it would be one hell of a model to emulate.
Finally, we not only had to find the exposures, we had to lay out a plan to address them. This last requirement moderated our results: we tended not to shoot folks who were making mistakes because they were overworked and underfunded, since, unless we could address those two issues, it would be our asses on the line next. To my knowledge, ours was the only team that was requested even outside of an audit schedule. If a unit was having problems it couldn’t address, we’d come in and address them, and if we were invited, we tended not to shoot the person who invited us. But we were incredibly harsh if we found an executive covering something up.
Even though the work schedule had me on the road three weeks out of four, working 16-hour days during the week and another 12 hours on weekends, it was incredibly rewarding because we actually fixed things. But our group was disbanded when the firm fell into financial collapse, largely because we focused only on the divisions -- our CFO and his successor assumed their own shop was well run but, sadly, it wasn’t. By the time the firm realized that, it was too late.
Before I left the job, I had one final audit: a simple security audit of our CEO, a stickler for security who was often outspoken about how his executives didn’t “get it” and how only someone with his training at IBM could possibly understand the need for, and the method of, securing a firm adequately. Had it been up to him, he would never have been audited, but this was a requirement for compliance, and apparently the then-CFO felt no need to give the CEO, who was clearly a legend in his own mind on security, a heads-up. It turned out that, for convenience, his secretary had all the keys to his office and his confidential safe. She didn’t like carrying the keys, so she kept them in her desk and, wouldn’t you know it, on the day of the audit she didn’t lock her desk.
By the way, I wasn’t allowed to present my findings in person because my boss felt I’d look like I was enjoying it too much. Sadly, he was right.
Shortly after the first audit I led, embezzlers were discovered in the unit we’d just audited. I was crushed because I felt I’d let the team down, and the problem was discovered in an area that I’d looked at, making it doubly embarrassing. I had reported that embezzlement was likely, and we had sampled heavily looking for it, but the fraudulent invoices just weren’t part of the sample and we missed it. The heart of the problem was a lack of separation of duties, which allowed the same person to both approve an invoice and issue the check. The unit’s CFO defended this practice based on a lack of staff and confidence that his people wouldn’t cheat the firm.
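The missing control here is simple to state: the person who approves an invoice must not be the person who issues the check. A minimal sketch of such a check, in Python, might look like the following (the class, names, and workflow are my own illustration, not anything from the audit described above):

```python
# Illustrative sketch of a separation-of-duties control: the same
# individual may not both approve an invoice and issue its payment.
# All names and structure here are hypothetical.

class SeparationOfDutiesError(Exception):
    """Raised when one person tries to hold both sides of a transaction."""
    pass


class InvoiceWorkflow:
    def __init__(self):
        # Maps invoice_id -> user who approved it.
        self.approved_by = {}

    def approve(self, invoice_id, approver):
        """Record who approved the invoice."""
        self.approved_by[invoice_id] = approver

    def issue_check(self, invoice_id, issuer):
        """Issue payment, enforcing that issuer != approver."""
        approver = self.approved_by.get(invoice_id)
        if approver is None:
            raise SeparationOfDutiesError("invoice was never approved")
        if approver == issuer:
            raise SeparationOfDutiesError(
                "the same person cannot approve an invoice and issue its check")
        return f"check issued for {invoice_id}"
```

The point of the sketch is that the system, not the people, enforces the rule: even a trusted, understaffed office can't quietly collapse the two roles into one.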
The theft was for about $500K, and it highlighted something else we later started looking for: a person who wasn’t taking vacations, because, of course, when they finally did, the theft would be discovered. The assumption that people are honest is a common one, but people are human, and financial pressures, social pressures, and substance abuse can push otherwise honest people into mistakes.
Security isn’t about assuming someone is dishonest or honest. It’s about protecting the asset in an environment that has people in it and recognizing that you can’t assume anything about people. So, you create a system where you don’t have to. Behind every security breach, there is generally at least one false assumption. Discovering and eliminating those assumptions before the breach is one of the only ways I know to actually prevent breaches.
After our chat, Stuart sent me a nice note and wrote the following, “I used to tell all the engineers and consultants I have trained that ‘It isn't the politicians, the bankers, the behind-the-scenes people they should fear; it is us in the IT world especially in security, for we control access to the data. That’s why we must hold ourselves to a higher ethical standard than any other group in the industry.’ I came up with that about 25 years ago.”
He set up a process where he isn’t assuming anyone is honest or dishonest, but where the assets are protected regardless. He even set this up so they are protected from him, because if you can’t trust your IT security people and you assume you can, you will be totally screwed.
By the way, I’ll leave you with another story. Stuart and I had something in common: we both told our CEOs to their faces that they were doing something stupid, and we both survived it. Stuart’s CEO was smarter, though, and funded the security fix. Mine went ahead and wiped out two-thirds of our revenue (my point wasn’t about security but about sales commissions, where I was focused at the time), which once again points to the benefits of working for a thinking CEO.
Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm. With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+.