The enterprise has taken to cloud storage so quickly that it is now the rare organization that does not host some of its data outside its own infrastructure. The primary drivers, as in most things, are cost and convenience, with even individual business units finding it easier to spin up cloud resources than request them from IT.
Still, the fact remains that the cloud constitutes an isolated environment within the overall enterprise data infrastructure, one that is both slower and more difficult to manage using existing means and practices.
That may be about to change, however, as new standards and services emerge with an aim toward integrating cloud resources more closely with traditional IT infrastructure.
A key development came from the International Organization for Standardization (ISO) with its ratification of the Cloud Data Management Interface (CDMI), which defines a series of protocols that foster interoperability between public and private clouds. The system was devised by the Storage Networking Industry Association (SNIA) as a means to carry metadata across internal and external infrastructure so data can be accessed no matter where it is physically housed. It employs a RESTful HTTP interface to maintain both data and control paths, as well as a common security format, even as data moves from cloud to cloud.
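The shape of that RESTful interface can be sketched in a few lines: objects and containers are addressed by URI, and dedicated headers carry the CDMI version and media types. The endpoint, container, and object names below are hypothetical; consult the SNIA CDMI specification for the authoritative details.

```python
from urllib.request import Request

# Illustrative sketch of a CDMI-style read request. The endpoint and
# object names are hypothetical; the version header and CDMI media type
# follow the pattern defined in the SNIA CDMI specification.

CDMI_VERSION = "1.0.2"                   # spec version the client speaks
ENDPOINT = "https://cloud.example.com"   # hypothetical CDMI endpoint

def cdmi_read_request(container: str, obj: str) -> Request:
    """Build (but do not send) an HTTP request for a CDMI data object."""
    return Request(
        f"{ENDPOINT}/{container}/{obj}",
        headers={
            "X-CDMI-Specification-Version": CDMI_VERSION,
            "Accept": "application/cdmi-object",  # data plus its metadata
        },
        method="GET",
    )

req = cdmi_read_request("backups", "report.txt")
print(req.full_url)             # -> https://cloud.example.com/backups/report.txt
```

Because the metadata travels in the same request as the data, any CDMI-aware client can resolve an object regardless of which cloud currently holds it.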
Cloud providers are also making strides in breaking down the barriers between their platforms and internal enterprise infrastructure. Rackspace recently launched the Cloud Block Storage service, which it bills as the means to maintain consistent performance for file systems, databases and high I/O applications. Built on OpenStack, the service features standard and high-speed SSD tiers and can deliver up to 1 TB to a single cloud server, providing an integrated platform across public and private infrastructure, even if it extends beyond the Rackspace cloud.
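Provisioning a volume in the OpenStack style comes down to a small JSON request body naming a size and a storage tier. The sketch below builds such a body without sending it anywhere; the request shape follows the OpenStack Block Storage (Cinder) API, while the tier names are hypothetical stand-ins for whatever a given provider exposes.

```python
import json

# Sketch of an OpenStack-style create-volume request body. The
# {"volume": {...}} envelope follows the OpenStack Block Storage API;
# the tier names ("ssd", "standard") are illustrative placeholders.

def volume_request(size_gib: int, tier: str) -> str:
    """Return the JSON body for a create-volume call (not sent anywhere)."""
    if not 1 <= size_gib <= 1024:            # up to 1 TB per volume
        raise ValueError("size must be between 1 GiB and 1 TiB")
    return json.dumps({"volume": {"size": size_gib, "volume_type": tier}})

print(volume_request(1024, "ssd"))           # a 1 TB high-speed SSD volume
```

Because the same API works against any OpenStack deployment, the identical request can target a public cloud or a private one, which is precisely the integration the paragraph above describes.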
Data management firms also have a vested interest in bridging the public/private divide. Novell just released the Dynamic File Services (DFS) platform, which promises file-based management of unstructured data across multiple cloud providers. The cloud is a convenient means to handle the onslaught of unstructured data, but without a unified management stack it can be difficult to mine that data or run analytics in a coordinated fashion. DFS includes tools such as an intelligent policy engine that unifies archiving across multiple clouds and storage platforms, as well as tiered data management and Web-based retention review that streamline compliance and tracking functions.
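At its core, a tiering policy engine of this kind maps attributes of a file, such as its age, to a storage destination. The toy version below shows the idea; the thresholds and tier names are illustrative assumptions, not Novell's actual rules.

```python
from dataclasses import dataclass

# A toy policy engine in the spirit of DFS-style tiered data management:
# each rule maps a file's age to a storage target. Thresholds and tier
# names are illustrative assumptions, not any vendor's real policy.

@dataclass
class FileInfo:
    path: str
    age_days: int

# Ordered rules: (minimum age in days, destination tier).
POLICY = [
    (365, "cloud-archive"),   # cold data: off to a cloud archive tier
    (30,  "cloud-standard"),  # warm data: standard cloud storage
    (0,   "primary-nas"),     # hot data: stays on primary storage
]

def place(f: FileInfo) -> str:
    """Pick the first tier whose age threshold the file meets."""
    for min_age, tier in POLICY:
        if f.age_days >= min_age:
            return tier
    return "primary-nas"

print(place(FileInfo("q3-report.docx", 400)))  # -> cloud-archive
print(place(FileInfo("draft.txt", 2)))         # -> primary-nas
```

The value of centralizing such rules is that archiving behaves consistently no matter which cloud or storage platform sits behind each tier.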
Still, there are those who argue that the cloud represents a unique opportunity to remake functions like storage from the ground up. Why, for instance, must storing data be a completely separate process from other application activities, particularly when the application itself resides in the cloud? The latest update from Box includes a feature that embeds storage directly in the application so everything is backed up automatically. There’s no need to manually save files, assign names or perform any other storage functions because the app knows where your data is and how to retrieve it. The service is likely to be a boon to mobile users who seek to simplify manual processes as much as possible.
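The general pattern behind such embedded storage can be sketched simply: every mutation of the document persists itself, so no explicit "save" step exists. This is a minimal illustration of the concept, not Box's implementation; all names are invented.

```python
import json
import os
import tempfile

# Minimal sketch of storage embedded in the application: every change to
# the document writes itself to disk, so the user never issues an explicit
# save. Purely illustrative; not any vendor's actual implementation.

class AutoSavedDoc:
    def __init__(self, path: str):
        self._path = path
        self._data = {}

    def set(self, key: str, value) -> None:
        self._data[key] = value
        self._flush()                    # persist on every mutation

    def get(self, key: str):
        return self._data[key]

    def _flush(self) -> None:
        with open(self._path, "w") as fh:
            json.dump(self._data, fh)

path = os.path.join(tempfile.gettempdir(), "autosave-demo.json")
doc = AutoSavedDoc(path)
doc.set("title", "Q3 notes")             # already on disk; no manual save
with open(path) as fh:
    print(json.load(fh)["title"])        # -> Q3 notes
```

For a mobile user, the payoff is exactly what the paragraph above describes: the application, not the person, tracks where the data lives and how to retrieve it.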
Ultimately, breaking down the barriers between the data center and the cloud will require enterprise environments to become more cloud-like. It’s a rare example of established technologies having to conform to the newcomers, rather than the other way around.
It’s in the enterprise’s best interests to foster this transformation, however. The cloud is needed to handle the growing volumes and increasing complexity of modern data environments, but the enterprise needs a unified environment to ensure that its data can be readily accessed and put to good use.