Do you have your hybrid cloud yet? Are you planning to get one? If not, you may already be behind the curve, but that might not be a bad thing, given the immaturity of the technology and the lingering possibility that it won't do everything it is expected to do.
To give credit where credit is due, even those who stand to gain financially from selling hybrid cloud services are working hard to keep wishful thinking from getting ahead of reality. Rackspace, for one, is not shy about claiming that the hybrid cloud is the wave of the future, but it cautions that hybrids are not the ideal solution for all applications. Hybrids will almost certainly cost less than traditional infrastructure, but security is more complicated--not less effective, just harder to implement--and the ability to offload key applications can be hampered if the public provider does not offer the right APIs.
Indeed, the notion that the hybrid cloud can operate as a de facto extension of the data center itself is still way overblown, according to Pariveda Solutions’ Tim Aranki. As he explains to Cloud Tech News, services like dynamic workload migration and outright data bursting are intriguing possibilities, but neither the implementation nor the ongoing management capabilities for such an environment are ready yet. Still, that doesn’t mean today’s hybrid architectures can’t provide an economical solution for lower-tier workload support.
The big question in hybrid circles is whether or not to go open source. Companies like Rackspace and Red Hat are not shy about touting the benefits of open source technologies in hybrid environments. The primary benefit is the ability for enterprise users to add third-party applications of their choosing, rather than limiting themselves to the cloud provider's offerings. Red Hat, for instance, now offers a full Linux OpenStack platform that, when paired with the RH Cloud Infrastructure service, is expected to drive full hybrid IaaS capabilities across enterprise, ISP, cloud provider and even telecom environments. It is important to note, though, that using open source products does not always result in easy integration or full interoperability.
So far, most hybrid cloud strategies have centered on a single cloud running multiple applications. But it isn't hard to imagine multiple clouds coordinating application and data loads in an integrated fashion. As IT Business Edge blogger Mike Vizard reported this week, Skytap is already taking on this challenge by adding support for Network Address Translation (NAT) to its service portfolio. NAT offers tools like one-to-one address mapping that can be used to distribute applications across on-premises and public clouds. At the same time, the company has added a new command-line interface (CLI) and single sign-on capabilities to allow IT managers to synchronize activities taking place across disparate infrastructure.
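To see why one-to-one address mapping matters here, consider a rough Python sketch of the idea: each private, on-premises address is paired with exactly one public address, so an application keeps a stable endpoint no matter which side of the hybrid boundary traffic originates from. This is purely illustrative--the class and method names are hypothetical and do not reflect Skytap's actual tooling.

```python
# A minimal sketch of one-to-one (static) NAT mapping. Each private IP maps
# to exactly one public IP, and vice versa, so translation works in both
# directions without ambiguity. Names here are illustrative assumptions.

class OneToOneNat:
    def __init__(self):
        self._private_to_public = {}
        self._public_to_private = {}

    def add_mapping(self, private_ip, public_ip):
        # Enforce the one-to-one property: neither address may be mapped twice.
        if private_ip in self._private_to_public or public_ip in self._public_to_private:
            raise ValueError("address already mapped")
        self._private_to_public[private_ip] = public_ip
        self._public_to_private[public_ip] = private_ip

    def translate_outbound(self, private_ip):
        # On-premises app reaching out: rewrite source to its public address.
        return self._private_to_public[private_ip]

    def translate_inbound(self, public_ip):
        # Public cloud traffic coming in: rewrite destination to the private address.
        return self._public_to_private[public_ip]

nat = OneToOneNat()
nat.add_mapping("10.0.0.5", "203.0.113.5")
print(nat.translate_outbound("10.0.0.5"))    # -> 203.0.113.5
print(nat.translate_inbound("203.0.113.5"))  # -> 10.0.0.5
```

Because the mapping is bidirectional and unique, a workload migrated from the data center to a public cloud (or back) can keep its externally visible address, which is exactly what makes distributing applications across environments tractable.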
It is all but certain that enterprises will incorporate hybrid clouds into their overall data infrastructure to a high degree. Integration and other issues will no doubt generate a wide variety of solutions, some more elegant than others, that will soon reduce the technology’s status from cutting-edge novelty to standard practice. But it is also clear that the hybrid cloud will serve as just one option in a wide-ranging data strategy, with pure public, pure private and even traditional data center architectures serving vital roles as well.
Hybrids, then, represent a future of the enterprise, not the future.