It’s been said that in order for the enterprise to transition from the static architectures of old to the dynamic world of the cloud, it will have to become more “data-centric.” But what exactly does that mean?
Most enterprises already rate their data as one of their most valuable resources, and are taking great pains to organize, analyze and utilize it to their best advantage. How, then, can the enterprise become more data-centric than it already is?
The first step in the process is realizing that "data-centric" does not mean enterprises can stop worrying about infrastructure entirely. Infrastructure will always form the basis of data environments and should be nurtured and maintained as it always has been.
In a data-centric architecture, however, the tools and platforms used to manage and secure data will no longer reside on the server, storage, network or even virtual layer, but within the data itself. As Vormetric VP Paul Ayers notes, this allows the enterprise to enforce security, governance and other policy-based functions no matter where the data resides. This is crucial when devising a truly dynamic cloud environment, as it can be very difficult to keep track of where data is housed once it leaves the enterprise firewall.
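To make the idea concrete, here is a minimal, hypothetical sketch of a "self-protecting" data object: the access policy and a tamper-evident seal travel inside the object itself, so enforcement does not depend on which server, network, or cloud the object happens to land on. The key handling, field names, and policy shape are all illustrative assumptions, not any vendor's actual mechanism.

```python
import hashlib
import hmac
import json

# Assumed shared key; real key distribution/escrow is out of scope here.
SECRET = b"shared-enterprise-key"

def wrap(payload: dict, policy: dict) -> dict:
    """Bundle the payload with its policy and a tamper-evident MAC."""
    body = json.dumps({"payload": payload, "policy": policy}, sort_keys=True)
    tag = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def access(wrapped: dict, requester_role: str) -> dict:
    """Enforce the embedded policy wherever the object resides."""
    body, tag = wrapped["body"], wrapped["tag"]
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        raise PermissionError("object has been tampered with")
    record = json.loads(body)
    if requester_role not in record["policy"]["allowed_roles"]:
        raise PermissionError("role not permitted by embedded policy")
    return record["payload"]

obj = wrap({"customer": "Acme"}, {"allowed_roles": ["analyst"]})
print(access(obj, "analyst"))  # policy check passes wherever obj is stored
```

The point of the sketch is the location-independence: because the policy rides with the data, the same check runs identically inside or outside the firewall.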
On the downside, this will require a fundamental shift in attitude within the IT department, which has long adopted a fortress mentality toward management: secure the perimeter and ensure a safe working environment for those inside. But as data management guru Malcolm Chisholm points out, the data-centric enterprise will function more like an open laboratory, where "data scientists" can use the infrastructure to create new forms of data that carry, embedded inside, all the tools needed to interact with the wider world, or to refuse to. In this way, innovation and productivity can flourish while security and stability are maintained.
Part of this process is getting IT and other stakeholders to accept the fact that in a data-centric world, outsiders will most likely gain access to your data. But that's OK, because as long as security and the other embedded controls remain intact, they won't be able to do much with it. Of course, no security measure is foolproof forever, which is where data lifecycle management comes in. If access can't be denied permanently, it can at least be delayed long enough that the data is of little value once revealed.
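That "delay until worthless" idea can be sketched in a few lines. In this hypothetical lifecycle policy, each record is tagged with the window in which its contents still matter, and a periodic sweep retires records whose business value has expired, so a breach after that point reveals only stale data. The field names and the sweep function are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

def make_record(payload: str, valuable_for: timedelta) -> dict:
    """Tag a record with the window in which its contents still matter."""
    now = datetime.now(timezone.utc)
    return {"payload": payload, "valuable_until": now + valuable_for}

def lifecycle_sweep(records: list) -> list:
    """Retire records whose business value has already expired."""
    now = datetime.now(timezone.utc)
    return [r for r in records if r["valuable_until"] > now]

records = [
    make_record("Q3 earnings (pre-announcement)", timedelta(days=30)),
    make_record("expired promo codes", timedelta(days=-1)),  # already stale
]
survivors = lifecycle_sweep(records)
print(len(survivors))  # → 1: only the still-valuable record remains
```

A real system would re-key or destroy the retired records rather than just filter them, but the principle is the same: protection effort is matched to how long the data actually holds value.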
This is why platform providers like Hewlett-Packard and IBM are quickly adding software and data lifecycle management tools to their portfolios as enterprises come to the realization that with infrastructure becoming a fungible asset, it’s the data that needs greater management support, not the systems. A key aspect of this effort, however, is the need to identify and classify data to ensure that information with long-term value receives the highest levels of security.
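The classification step described above can be sketched as a simple rule-based tiering scheme: data sets are assigned a sensitivity tier, and each tier maps to a bundle of controls, with long-lived, high-value information landing in the strongest one. The tier names, rules, and control mappings here are illustrative assumptions, not any vendor's taxonomy.

```python
# Illustrative tiers: each maps to a bundle of security controls.
TIERS = {
    "public":       {"encrypt": False, "retention_review": False},
    "internal":     {"encrypt": True,  "retention_review": False},
    "confidential": {"encrypt": True,  "retention_review": True},
}

def classify(dataset: dict) -> str:
    """Crude rule-based classifier: long-term value implies a higher tier."""
    if dataset.get("contains_pii") or dataset.get("long_term_value"):
        return "confidential"
    if dataset.get("internal_only"):
        return "internal"
    return "public"

ds = {"name": "customer_master", "contains_pii": True, "long_term_value": True}
tier = classify(ds)
print(tier, TIERS[tier])  # confidential tier: encryption plus retention review
```

In practice classification would draw on content scanning and metadata rather than hand-set flags, but the mapping from classification to controls is the part that makes "highest levels of security for long-term value" enforceable.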
Data-centricity is one of those trends that upset a lot of apple carts on the way to acceptance. The management and security practices at most enterprises have long histories behind them, and the experience gained from developing and perfecting these techniques should not be lightly tossed away.
But the fact remains that the old ways of doing things are no longer optimal in the highly fluid data environments that are emerging all around us. And as use of the cloud and other third-party services becomes more common, infrastructure-based management will prove to be neither effective nor cost-efficient.