From the media reports, it may seem like there is a mad rush to the cloud. But in terms of actual deployments, it's been more like a leisurely stroll through the park.
At this point, only a small fraction of enterprise applications run on the cloud, and even those can hardly be considered vital to enterprise operations. Considering the kind of change that the cloud represents, this isn't all that surprising.
But most observers would agree that one major stumbling block to widespread use of the cloud is the fear of vendor lock-in. Even in traditional data center environments, few IT managers would recommend going all-in on one of the major platform providers, despite the soup-to-nuts offerings from HP, Cisco and others. Such an environment would certainly alleviate much of the integration and interoperability issues that arise in multivendor environments, but there's a certain level of comfort in knowing that your options are open and that third-party systems can still provide the edge you might need to meet your goals.
That's why the concept of open cloud platforms is so appealing, even if the implementation is proving rather difficult. At the moment, a handful of seed projects are up and running. Organizations like Cloud.com are experimenting with integrated IaaS solutions said to accommodate multivendor cloud frameworks, such as VMware's vCloud and Amazon Web Services. You also have traditional enterprise vendors providing internal cloud infrastructure components said to offer open cloud access. Xsigo comes to mind with its new VP560 I/O Director, which the company claims is interoperable with a wide range of hardware and software products on the market.
The question is whether this is enough. As we've seen in the past, it is very difficult to foster open source communities in environments dominated by a few top players. And with Amazon, Microsoft, Google and, increasingly, Rackspace calling the shots, it would take quite a counter-push on the part of the open source community to make any real headway. With the cloud still in its infancy, says ZDNet's Paula Rooney, that effort will have to come soon, before the playing field is laid out by the big guys.
Organizations like the Open Cloud Initiative and the Open Cloud Consortium are key parts of the process, but standards bodies generally deal mostly in paper and theories. What's needed is an actual test lab that can move interoperability out of the design phase and into simulation. This is the only way to prevent actual deployments from being plagued by the bugs and integration issues that have made open source such a turn-off in the past. IBM may have the ideal facility in Singapore, according to OStatic's Sam Dean. IBM probably didn't have open source on its mind when it joined with Singapore's Infocomm Development Authority on the project, but the fact that it's open to universities, government agencies and other institutions might produce enough support for open source initiatives to make real headway.
Open source is an effective tool for closed data center environments. But if the cloud is to live up to its promises, there will have to be a robust interoperability mechanism to allow for maximum scalability and flexibility to accommodate increasing data loads.
An open environment would work wonders to that end. But getting there will take a lot of cooperation.