It’s easy to talk about the cloud as a single entity, but the fact is that, like real clouds, there are many of them out there, and they come in a stunning array of varieties.
This poses more than just an aesthetic problem for the enterprise, as there is no guarantee that one cloud will be compatible with another or that data stored in one location will be available to applications hosted somewhere else.
The goal, of course, is one giant, integrated data ecosystem spanning internal and external resources – distributed, scalable and highly dynamic, but at the same time fully automated, interoperable and secure. Anything less and the enterprise runs the risk of building the same vendor-specific, silo-based architectures it has at home.
Interoperability is nothing without agreed-upon standards, though, and at least with the cloud the push for a common set of interfaces is already well under way. According to John Messina, co-chair of the cloud computing reference architecture working group at NIST, generally accepted interoperability standards are three to five years out, which gives the enterprise plenty of time to experiment with the various cloud options before committing critical applications to outside resources.
But even then, interoperability won’t be available at the snap of a finger. Indeed, says Sony Entertainment IT Director Ian Cox, IT will have to take a highly proactive approach to building next-generation infrastructure. Particularly when it comes to managing large, mission-critical legacy applications in the cloud, IT will need to find a way not only to integrate them into multi-tier, multitenant environments but also to ensure that reliability and availability are maintained across increasingly distributed architectures.
Of course, open platforms go a long way toward establishing broadly interoperable environments, but they are by no means vital. Top platform providers like HP have been quick to embrace OpenStack and other open frameworks, primarily because they already provide large portions of the hardware, software and services portfolios that the enterprise needs to build cloud architectures. And since nobody can truly own the cloud, it makes sense to be first among equals when it comes to supplying the tools and technologies of the new data paradigm.
For the enterprise, however, interoperability in the cloud has more to do with application and data portability than with integration of hardware and software platforms. It will most certainly be possible to foster broad portability across distributed systems without the blessings of OpenStack, CloudStack or any of the other open platforms in play, provided the correct APIs are in place. And in a way, this will afford the enterprise more control over data and resource deployment because new infrastructure will require a more formal introduction to existing infrastructure in order to ensure compatibility.
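To make that point concrete, consider a provider-neutral library such as Apache Libcloud – one illustrative option, not something the open platforms above prescribe – which exposes the same storage calls whether the back end is Amazon S3, Google Cloud Storage or another supported provider. The sketch below, with hypothetical credentials and object names, shows how a common API lets data move between otherwise incompatible clouds without either side knowing the other’s native interface.

```python
# A minimal sketch of API-level data portability using Apache Libcloud.
# Provider choices, credentials, container and object names are all
# hypothetical placeholders for illustration only.
from libcloud.storage.types import Provider
from libcloud.storage.providers import get_driver

def copy_object(src_conf, dst_conf, container_name, object_name):
    """Copy one object between two clouds through a single, uniform API."""
    src = get_driver(src_conf["provider"])(src_conf["key"], src_conf["secret"])
    dst = get_driver(dst_conf["provider"])(dst_conf["key"], dst_conf["secret"])

    # Locate the object on the source cloud.
    obj = src.get_object(container_name, object_name)

    # Stream it out of one provider and into the other; the calling code
    # never touches either provider's native protocol.
    dst_container = dst.get_container(container_name)
    stream = src.download_object_as_stream(obj)
    dst.upload_object_via_stream(stream, dst_container, object_name)

source = {"provider": Provider.S3, "key": "SRC_KEY", "secret": "SRC_SECRET"}
target = {"provider": Provider.GOOGLE_STORAGE, "key": "DST_KEY", "secret": "DST_SECRET"}
# copy_object(source, target, "backups", "app-data.tar.gz")
```

The design choice here is the one the enterprise cares about: portability lives in the API layer, so swapping a provider means changing a driver and a set of credentials, not rewriting the applications that depend on the data.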
Still, maintaining a cohesive environment across highly distributed infrastructure isn’t likely to be a cakewalk. As the enterprise offloads more infrastructure to third-party providers, the management focus will naturally shift from servers, storage and networking to application and data interoperability.
This won’t necessarily make life for IT any easier, but it will open up vast new possibilities for the knowledge workforce.