Given the massive amounts of data involved, it was only a matter of time before somebody started talking about the need to apply compliance controls to Big Data platforms such as Hadoop.
Dataguise today announced DgHadoop, which the company claims is the first framework for managing compliance in a Hadoop environment. DgHadoop is designed to enable compliance assessment and the enforcement of centralized data privacy policies.
While nobody is in love with the idea of compliance, securing data in a Hadoop environment is an issue that needs to be addressed. The more data that is centralized in a single system, the more likely it is that someone will try to compromise that system. IT organizations will need to find the simplest, least painful way of demonstrating that measures were put in place to secure that data.
Dataguise CEO Manmeet Singh says DgHadoop is an extension of the company's existing compliance framework, which is built on a service-oriented architecture (SOA). The benefit of that approach, says Singh, is that IT organizations can use the same core architecture to secure data in both traditional enterprise applications and Hadoop.
Before engaging in a Hadoop project, Singh says, IT organizations need to assess exactly what risks are involved. Almost by definition, the data that most organizations want to put into Hadoop is some of the company's most valuable data in its rawest form. As a result, Singh says, it's not always immediately obvious to everyone working in a Hadoop environment just how sensitive that information can be.
Like it or not, Hadoop is no exception to the new realities of IT compliance. If anything, the current focus on Big Data is likely to create a new level of urgency around compliance, perhaps more than any other emerging technology in recent memory.