Invaluable Lessons for IT Security Pros from the Edward Snowden Case

    As dark and damaging as it is, the case of Edward Snowden, the former National Security Agency contractor who continues to leak highly classified and extremely sensitive information about the NSA’s intelligence-gathering operations, has a silver lining. It serves as a catalyst for corporate Chief Information Security Officers (CISOs) to ensure they have safeguards in place to prevent this sort of debacle from happening in their own companies.

    One individual with an especially knowledgeable perspective on the Snowden case, and one who also works with CISOs on a daily basis, is John Dickson, a principal at Denim Group, a San Antonio-based provider of secure software products and services. Dickson is a former U.S. Air Force intelligence officer with extensive experience in network and software security within the U.S. intelligence community, and he currently serves as a civilian cybersecurity advisor to the Air Force.

    I spoke with Dickson about the Snowden case on Monday, and I noted that Snowden had access to this massive repository of highly classified information in his reported capacity as a systems administrator (aka an IT guy). Given that in the corporate IT world we talk about the insider threat all the time, I asked Dickson what lessons CISOs can learn from the case. For starters, he said, it’s about data leakage protection (DLP): systems that inspect data as it leaves the company and flag sensitive keywords, addressing the insider threat. According to Dickson:

    The DLP world is understanding who has access to what. If I’m a corporate CISO, what I probably want to know is, do I have a data classification program in place, and how am I segregating my data. This guy had unprecedented access to a lot of data, begging the question, how could somebody that junior have access to so much stuff? A lot of people focus on authorization or access issues for regular users, but they forget the part about systems admins. And what you have to do with admins is you have to segregate their systems–you can’t give people universal [administrative privileges] for everything. That’s what happened in this instance–they probably restricted information going to analysts and regular intelligence users, but the sys admin guy holds an extra-trusted role. You want to segregate information; you want to log information accessed by these admins so that somebody else can look at it. You log it off in a cryptographically irreversible way so it can’t be tampered with. So there’s a ‘flight recorder’ of data that says, ‘OK, this guy has super-admin rights on Active Directory,’ but somebody over here is able to recreate what he did in case he was doing weird insider-threat things.
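    Dickson’s “flight recorder” for privileged activity can be approximated with a hash-chained audit log: each entry commits to the hash of the one before it, so altering any earlier record invalidates every later hash. The sketch below is illustrative only; the class and field names are hypothetical and do not reflect any particular product’s API.

```python
import hashlib
import json

class AuditLog:
    """Append-only log in which each entry commits to the previous one,
    so tampering with any earlier record breaks every later hash."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []          # list of (record, digest) pairs
        self.prev_hash = self.GENESIS

    def append(self, actor, action):
        # Canonical serialization (sorted keys) so the hash is reproducible.
        record = {"actor": actor, "action": action, "prev": self.prev_hash}
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append((record, digest))
        self.prev_hash = digest

    def verify(self):
        """Recompute the whole chain; any edit to a past entry is detected."""
        prev = self.GENESIS
        for record, digest in self.entries:
            if record["prev"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != digest:
                return False
            prev = digest
        return True

log = AuditLog()
log.append("admin1", "read /hr/payroll.xlsx")
log.append("admin1", "export /intel/report.pdf")
print(log.verify())   # True: chain intact
log.entries[0][0]["action"] = "read /public/readme.txt"   # tamper with history
print(log.verify())   # False: tampering detected
```

    In practice the digests would also be shipped to a separate system that the administrator cannot reach, so, as Dickson puts it, “somebody over here” can independently reconstruct what was done.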

    Dickson highlights the essential nature of data classification–ensuring that especially sensitive data, like HR and M&A information, is treated differently:

    If you’re doing a super-secret acquisition of Company X, you may want to completely pull that and say, ‘We’re going to buy five or six laptops; we’re going to have these guys in a war room over here, and they are not going to connect to our network, just to be on the safe side.’

    Dickson went on to explain that what’s been happening within the intelligence community since 9/11 is that the fundamental “need to know” criterion that had previously defined access privileges has largely been abandoned:

    The other thing [CISOs] can learn from this is that post 9/11, the ‘need to know’ drive got turned around. Those barriers, both internally to the organization and externally, were largely broken down, because we wanted to get as much data as possible to connect the dots. What a corporate CISO can learn from this, say, if there’s a big data project being rolled out, is to recognize what the [security] implications are of consolidating all of this data.

    I’ll give you a great example. I’ve heard war stories from clients, where they’ll put a Google search appliance within their enterprise, and it’ll crawl everything. So you’ve essentially done internal to your private network what Google does to the Internet. Guess what shows up in these things if you don’t do it correctly? The CEO’s paycheck; HR stuff; travel and expense reports of executives. If this big data is all in one place, what are the downsides to that? Do you have to further segregate it or add protection layers?
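    The search-appliance war story suggests one simple safeguard: check a document’s classification label against the clearance level the index serves before the document is ever crawled. A minimal sketch, using hypothetical label names and file paths:

```python
# Ordered from least to most sensitive; labels are illustrative only.
CLEARANCE_ORDER = ["public", "internal", "restricted"]

def indexable(doc_label: str, index_clearance: str) -> bool:
    """A document may enter an index only if its label is at or below
    the clearance level that index serves."""
    return CLEARANCE_ORDER.index(doc_label) <= CLEARANCE_ORDER.index(index_clearance)

documents = [
    {"path": "intranet/holiday-schedule.txt", "label": "public"},
    {"path": "hr/executive-payroll.xlsx", "label": "restricted"},
]

# Build the company-wide index at "internal" clearance: payroll stays out.
indexed = [d["path"] for d in documents if indexable(d["label"], "internal")]
print(indexed)   # ['intranet/holiday-schedule.txt']
```

    The point of the design is that the filter runs at crawl time, not search time: data that never enters the index cannot leak through it, which is exactly the segregation question Dickson is asking about big-data consolidation.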

    Moreover, Dickson said he’s a big believer in much more extensive background checks for systems administrators:

    Typically for employees, you do a criminal background check to find out if they’ve been arrested in your county or state, and maybe if the person was involved in some type of lawsuit. Those searches, I’ve learned as an employer, can be pretty narrow in scope, or they can be broad and extensive. For an admin who’s going to have the keys to the kingdom, you probably want to do a little bit more than your standard, cursory background check. You probably want to do multiple states, and interview people who know this person.

    I asked Dickson whether he would surmise that there are best practices NSA failed to observe in this case that allowed Snowden to get access to more information than he should have had. He said that would be pure conjecture on his part, because he’s not privy to NSA’s processes:

    But what I would say is that arguably, post-9/11, with Iraq and Afghanistan, there has been an influx of contractors and GS [pay scale] civilians. My sense is, anecdotally, that their filters were probably less rigorous than they were before that. It was all about streamlining the process to get more people–Urdu linguists or Persian linguists, whatever–that was the pressure of the last decade, and that just changed. So now, if you’re doing an SBI [special background investigation] or a five-year recert [scheduled security review of the employee every five years], the next round of people are going to get the ‘full monty.’

    Finally, I expressed my own view that given the pervasiveness of technology, the nature of the digital world, and the continuous breaking down of barriers to information collection, it seems we’re not too far away from the day when it’s going to be virtually impossible to keep anything secret. I asked Dickson for his thoughts on that. His response:

    That’s a very interesting question. The way to put it is not all information is of the same value. The way that it has worked is the more valuable the data, the more protections that were put towards it. So to make a statement like that is really not taking any of that into account. What this might do is help us have a better understanding of what data is out there, and maybe make those decisions a little more clear. … You can’t just put stuff up anymore. You can’t just give willy-nilly, blanket access to a contractor. In the DoD, that wasn’t the focus. The focus was, ‘let’s catch terrorists, let’s shut down the Taliban.’ I think that will change, and that will have a profound effect. The consequences will be that it will be harder to get a clearance; you’re going to have access to less stuff, you’re going to have to justify everything. We will go back to where we were pre-9/11–you’ll have to be in-briefed into special programs, which is good, except you’re going to have the stovepipe issues come out again. That killed us on Sept. 11, and it killed us before that.