The cloud can certainly lower the cost of provisioning and maintaining enterprise data infrastructure, but does it really provide an operational benefit over your existing architecture?
Ultimately, it will, but it is not a simple matter of calling up a cloud provider, signing a few contracts, and then reaping the rewards of an infinitely scalable, dynamic infrastructure.
For one thing, the cloud will have to integrate directly with your existing data environment. For most organizations, that means making on-premises infrastructure more cloud-like, a feat made much easier through the introduction of virtualization. Keep in mind, however, that virtualization is only the first step toward an internal cloud architecture. If that were all there was to it, VMware would already have a lock on the cloud market, something that Information Week's Charles Babcock says is far from clear in light of the company's third-quarter results.
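What separates a cloud from mere virtualization, in practical terms, is the self-service layer that hands out capacity from a shared pool rather than waiting for an administrator to carve out each virtual machine by hand. The Python sketch below is purely illustrative (the class and resource names are invented, not any vendor's API), but it captures that extra layer:

    # Illustrative only: a toy self-service layer on top of a virtualized pool.
    # Virtualization supplies the VMs; the cloud-like part is the pooled,
    # on-demand allocation this class adds on top.

    class InternalCloudPool:
        def __init__(self, total_vcpus, total_ram_gb):
            self.free_vcpus = total_vcpus
            self.free_ram_gb = total_ram_gb
            self.instances = {}

        def provision(self, name, vcpus, ram_gb):
            """Self-service request: succeeds only while pooled capacity remains."""
            if vcpus > self.free_vcpus or ram_gb > self.free_ram_gb:
                raise RuntimeError(f"pool exhausted, cannot place {name}")
            self.free_vcpus -= vcpus
            self.free_ram_gb -= ram_gb
            self.instances[name] = (vcpus, ram_gb)

        def release(self, name):
            """Elasticity: returning capacity is as simple as taking it."""
            vcpus, ram_gb = self.instances.pop(name)
            self.free_vcpus += vcpus
            self.free_ram_gb += ram_gb

    pool = InternalCloudPool(total_vcpus=64, total_ram_gb=256)
    pool.provision("analytics-vm", vcpus=8, ram_gb=32)
    pool.release("analytics-vm")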
Clearly, then, something is missing. It’s not enough for enterprises to strive for cloud functionality. What is needed is cloud optimization — a way of utilizing the unique properties of the cloud to make IT not only cheaper, but better.
Companies like Datawatch Corp. say they are working toward that goal with new generations of cloud operations software designed to build the kinds of data architectures needed to handle Big Data and other challenges. The company's Datawatch Enterprise Server - Cloud is designed to coordinate application functionality across varying data types, including structured, unstructured and semi-structured, and then improve the delivery of both applications and data over disparate infrastructure. The platform is built on the company's Monarch system, which is designed to provide full data integration to enhance analytics and other functions as enterprises face the pressure of increasing data loads.
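As a rough illustration of what coordinating across varying data types involves, the sketch below normalizes structured (CSV), semi-structured (JSON) and unstructured (free text) inputs into a single record shape so downstream analytics can treat them uniformly. This is the generic pattern, not Datawatch's actual API; every function and field name here is invented:

    import csv
    import io
    import json

    # Hypothetical normalizer: maps three data shapes onto one record format.
    def normalize(source_kind, payload):
        if source_kind == "structured":        # e.g. a CSV export
            row = next(csv.DictReader(io.StringIO(payload)))
            return {"fields": dict(row), "text": None}
        if source_kind == "semi-structured":   # e.g. a JSON document
            return {"fields": json.loads(payload), "text": None}
        if source_kind == "unstructured":      # e.g. free-form report text
            return {"fields": {}, "text": payload}
        raise ValueError(f"unknown source kind: {source_kind}")

    records = [
        normalize("structured", "id,amount\n42,19.99"),
        normalize("semi-structured", '{"id": 43, "amount": 7.5}'),
        normalize("unstructured", "Invoice 44 settled in full."),
    ]
    # Every record now has the same shape, whatever its origin.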
A company called Yottabyte is also working toward more cloud-ready data environments, having just released a new version of its cloud operating system designed to let enterprises of all sizes build, provision and scale public, private or hybrid clouds using commodity components and the company's own hosting services. The system relies on a local appliance to house the free Yottabyte Community Beta Edition, which includes its own file system and Web interface to enable resource pooling, as well as storage area networking, backup/restore, encryption and other functions. The system also provides a KVM hypervisor platform, along with support for Windows and Linux applications.
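The commodity-components idea is essentially aggregation: many small nodes are pooled, and the operating layer decides where each workload lands, so scaling out means adding another node rather than buying a bigger box. The first-fit placement sketch below is an assumption about how such a layer might behave in outline, not Yottabyte's implementation:

    # Toy first-fit placement over pooled commodity nodes (illustrative only).
    nodes = [
        {"name": "node-1", "free_gb": 500},
        {"name": "node-2", "free_gb": 250},
        {"name": "node-3", "free_gb": 1000},
    ]

    def place(volume_gb):
        """Put a volume on the first node with room; scaling the cloud out
        just means appending more commodity nodes to the list."""
        for node in nodes:
            if node["free_gb"] >= volume_gb:
                node["free_gb"] -= volume_gb
                return node["name"]
        raise RuntimeError("pool full: add another commodity node")

    print(place(300))  # node-1
    print(place(300))  # node-3, since node-1 and node-2 now lack the space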
But since the cloud relies on highly distributed architectures, a key aspect of optimization is improving network traffic flow. A company called Gnodal is working to improve data flow with a new Ethernet-based switch that uses network intelligence to route around the congestion that hampers highly complex configurations. This enables real-time re-routing that maintains connectivity even when users, applications and underlying hardware are dispersed over great distances.
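A hedged sketch of the underlying idea: when several paths can reach the same destination, a congestion-aware switch picks the least-loaded one at forwarding time instead of pinning each flow to a fixed route. The topology and queue depths below are invented for illustration; Gnodal's actual switch logic is proprietary:

    # Illustrative adaptive routing: choose the least-congested of several
    # equal-cost paths, re-evaluated as conditions change.
    paths = {
        "via-spine-1": {"hops": 3, "queue_depth": 40},
        "via-spine-2": {"hops": 3, "queue_depth": 5},
        "via-spine-3": {"hops": 3, "queue_depth": 22},
    }

    def route(paths):
        """Among equal-hop paths, forward onto the shallowest queue, a
        stand-in for the real-time congestion signal the switch sees."""
        return min(paths, key=lambda name: paths[name]["queue_depth"])

    print(route(paths))                       # via-spine-2
    paths["via-spine-2"]["queue_depth"] = 90  # conditions change...
    print(route(paths))                       # ...and traffic shifts: via-spine-3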
The cloud represents an entirely new data architecture, and therefore requires a new way of thinking when it comes to management and operations. At the same time, enterprises have a vested interest in maintaining existing infrastructure even as it undergoes the transition to more cloud-like functionality.
In the end, we should all see more efficient, flexible data environments, but only if enterprise executives accept the fact that the cloud is not just a cheaper version of existing IT environments, but the foundation for new levels of data flexibility and worker productivity.