Toby Owen of Peer 1 Hosting identifies four drivers for the hybrid cloud:
- Federation
- Interoperability
- Big Data
- The Internet of Things
But covering data issues changes your perspective on lists like this, because when you boil it down, most technology is about sharing, securing or using data and information. And since information is just unstructured data, it really all comes down to data. So I look at that list and see only two drivers:
- Shared services (supported by federation and interoperability)
- Data. Just data.
My premise gets some support from a 451 Research paper on hybrid cloud integration architecture:
“The term ‘hybrid cloud’ has emerged to describe the relationship between any number of cloud computing services with in-place enterprise IT systems. It covers a broad range of scenarios, such as integrations between on-premises applications and data sources, virtualized managed services, services for sensors and mobile devices, and any type of private or public cloud, including IaaS (typically used to avoid capital outlays for servers, storage and networks), PaaS (typically used for application development and runtime operations) and SaaS (typically an outsourced application such as CRM).”
IoT? Data-driven. Mobile? It’s about the services if you’re a user, but organizations love the data. SaaS applications? Data and services.
“Hybrid cloud is important to big data because bare metal servers tend to be more suited to big data software like Hadoop, which is about horizontal scaling, and doesn’t always require specialty technology like the cloud,” Owen writes. “That means organizations embracing big data will still need traditional servers and hosting environments, which they will need to integrate with their cloud infrastructure by using a unified hybrid cloud platform.”
Likewise, only the hybrid cloud can support the flexibility needed for “the really BIG data generated by the IoT.” What’s more, Owen adds, the hybrid cloud allows you to leverage the data from legacy infrastructure without modernization.
Owen isn’t the only one who sees data as a key driver for the hybrid cloud. BlueStripe’s COO jokes that the hybrid cloud “might as well be called the Hybrid Data Center.” IT Business Edge infrastructure blogger Arthur Cole includes that clever remark in his own excellent exploration of the hybrid data ecosystem, then elaborates on the idea:
“It is a subtle change of perspective but it helps to focus attention on the fact that the cloud is a secondary consideration to the need to craft real-world solutions to existing challenges. As well, a hybrid data center strategy should make it clear that changes will be required along the entire data stack, including the application layer, which will need to embrace the challenges and opportunities of a distributed data environment.”
That means applications will need to support the flow of critical data across a disparate infrastructure, he adds. Thus, one driver for a hybrid cloud may indeed be the need for a hybrid data center.
That’s basically virtualizing your data center, contends former Microsoft Windows chief and Andreessen Horowitz board partner Steven Sinofsky, who, as it happens, isn’t keen on hybrid clouds.
“History clearly shows that betting on bridge solutions is the fastest way to slow down your efforts and build technical debt that reduces ROI in both the short- and long-term,” Sinofsky writes. “The reason should be apparent, which is that the architecture that represents the new solution is substantially different — so different, in fact, that to connect old and new means your architectural and engineering efforts will be spent on the seam rather than the functionality.”
Sinofsky recommends going to the cloud and leaving legacy as legacy, but the fact is, avoiding that costly seam between old and new is also a key reason companies virtualize data.
Indeed, creating an abstraction layer for the cloud and other new technologies is one of the biggest drivers for data virtualization, Denodo’s Senior Vice President for Data Virtualization Suresh Chandrasekaran told me recently.
“Lots of companies have modernization of applications for the cloud, mobile, etc. They have to move very fast, focusing just on the application logic and the UI,” Chandrasekaran said. “The data services team allows them to do that by providing a virtualization layer.
“It’s being adopted more by large companies as a unified data layer for operational and analytical use, and as an abstraction to enable them to adopt these modern technologies like mobile and cloud and big data and NoSQL.”
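To make that a little more concrete, here is a rough, hypothetical sketch of what such a unified data layer looks like from the application's point of view. It is not Denodo's product or API, and the class and source names are invented for illustration only: the application asks one virtual layer for an entity, and the layer routes the request to whichever back end holds it, whether that's an on-premises database or a cloud/SaaS source.

```python
# Hypothetical sketch of a "unified data layer," not any vendor's actual API.
# sqlite3 stands in for an on-premises database; a plain dict stands in for a
# cloud/SaaS source so the example stays self-contained and runnable.
import sqlite3


class OnPremSqlSource:
    """Legacy relational system, reached through SQL."""
    def __init__(self, conn):
        self.conn = conn

    def fetch(self, entity, key):
        row = self.conn.execute(
            f"SELECT id, name FROM {entity} WHERE id = ?", (key,)
        ).fetchone()
        return {"id": row[0], "name": row[1]} if row else None


class CloudSaasSource:
    """Stand-in for a SaaS API (e.g. a CRM); a dict replaces the HTTP call."""
    def __init__(self, records):
        self.records = records

    def fetch(self, entity, key):
        return self.records.get(entity, {}).get(key)


class VirtualDataLayer:
    """The abstraction layer: apps ask for entities, not for systems."""
    def __init__(self):
        self.routes = {}

    def register(self, entity, source):
        self.routes[entity] = source

    def get(self, entity, key):
        return self.routes[entity].fetch(entity, key)


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO orders VALUES (1, 'legacy order #1')")

    layer = VirtualDataLayer()
    layer.register("orders", OnPremSqlSource(conn))      # on-premises system
    layer.register("customers", CloudSaasSource(         # cloud SaaS system
        {"customers": {7: {"id": 7, "name": "Acme Corp"}}}))

    # Application code never knows, or cares, which side of the hybrid
    # infrastructure each entity lives on.
    print(layer.get("orders", 1))
    print(layer.get("customers", 7))
```

The point of the sketch is the one Chandrasekaran makes: the application team codes against the virtual layer and focuses on logic and UI, while the data services team decides, behind that layer, what stays on legacy systems and what moves to the cloud.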
For more on hybrid clouds and data, check out my previous post, “Hybrid Clouds Mean New API Integration Challenges.”
Loraine Lawson is a veteran technology reporter and blogger. She currently writes the Integration blog for IT Business Edge, which covers all aspects of integration technology, including data governance and best practices. She has also covered IT/Business Alignment and IT Security for IT Business Edge. Before becoming a freelance writer, Lawson worked at TechRepublic as a site editor and writer, covering mobile, IT management, IT security and other technology trends. Previously, she was a webmaster at the Kentucky Transportation Cabinet and a newspaper journalist. Follow Lawson on Google+ and Twitter.