Enterprises that wade into hybrid cloud infrastructure quickly come to a startling realization: Virtualization and hardware abstraction do not eliminate data silos all by themselves.
In fact, many organizations are finding that the integration challenges in the cloud are even greater than in the data center, if only because cloud infrastructure is expected to support a higher degree of data dynamism as a core capability.
But whether the goal is simple data bursting or a fully integrated, distributed IT stack, it is clear that the hybrid cloud will remain a work in progress for a while longer.
The rise of on-demand services and real-time analytics is posing a particular challenge to hybrid environments due to the presence of large amounts of streaming data. Platform developer Striim is targeting this need with a real-time data integration and streaming analytics system (now in version 3.7) that enables rapid data movement from on-premises infrastructure to the cloud. The software now includes direct integration with various Microsoft data solutions running on the hybrid cloud, including Azure Blob Storage and Azure File Storage, to allow streaming data collection and dynamic schema evolution for SQL Database deployments.
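The pattern behind this kind of data movement is worth a closer look. The sketch below is not Striim's implementation; it is a minimal, generic illustration of the micro-batching approach streaming-integration tools commonly use before writing to an object store such as Azure Blob Storage. The `upload` callable is a stand-in for a real SDK call, and all names here are hypothetical.

```python
from typing import Callable, List

class MicroBatchUploader:
    """Buffer streaming records and flush them to an object store in batches.

    A minimal sketch of micro-batching, assuming the real upload is done by
    an injected callable (in practice it might wrap an Azure Blob SDK call).
    """

    def __init__(self, upload: Callable[[str, bytes], None], batch_size: int = 100):
        self.upload = upload          # stand-in for the object-store write
        self.batch_size = batch_size  # records per blob
        self.buffer: List[str] = []
        self.batch_count = 0

    def add(self, record: str) -> None:
        """Queue one record; flush automatically when the batch fills up."""
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        """Write any buffered records as one newline-delimited blob."""
        if not self.buffer:
            return
        blob_name = f"events/batch-{self.batch_count:05d}.jsonl"
        payload = "\n".join(self.buffer).encode("utf-8")
        self.upload(blob_name, payload)
        self.batch_count += 1
        self.buffer.clear()

# Usage with a stubbed upload that just collects (name, payload) pairs:
uploaded = []
uploader = MicroBatchUploader(lambda name, data: uploaded.append((name, data)),
                              batch_size=2)
for event in ['{"id": 1}', '{"id": 2}', '{"id": 3}']:
    uploader.add(event)
uploader.flush()  # push the trailing partial batch
```

Batching like this trades a little latency for far fewer, larger writes, which is why it shows up so often in on-premises-to-cloud streaming pipelines.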
Other developers are turning to artificial intelligence to manage the metadata needed to integrate the hybrid cloud. Informatica’s new Claire module in the Intelligent Data Platform promises end-to-end data management for organizations embarking on the transition to a digital services-based business model. Claire – a play on the word “clairvoyance” – is designed to imbue Informatica’s entire suite of data management products with machine learning and intelligent automation to better absorb and interpret technical, business, operational and usage metadata generated by distributed infrastructure. Not only does this reduce management overhead, but it also supports faster, more accurate decision-making by providing greater visibility into disparate data sets.
In many respects, hybrid integration is about maintaining data availability across diverse architectures and platforms. To that end, Veeam Software just teamed up with N2W Software to provide a cloud-native, agentless backup and availability solution for multi- and hybrid cloud environments. The system combines N2W’s AWS-facing Cloud Protection Management system with Veeam’s Availability Suite to allow enterprise customers to seamlessly copy data from AWS to a Veeam repository for operational backup and cross-platform disaster recovery. In this way, organizations will be able to maintain access to data across hybrid architectures in support of emerging service-, application- and data-layer functionality.
Integrated management platforms are certainly crucial to hybrid cloud performance, but the enterprise will also need to adopt new approaches to data organization and the role applications play in strategic business objectives, says Primary Data CTO David Flynn. Most organizations structure initial hybrid infrastructure around relatively simple functions like archiving. But even here, there needs to be a deep-dive examination of application lifecycles, data location and any potential migration issues that could arise during normal, or even abnormal, business operations. Not only will this help streamline infrastructure and lower resource consumption, but it should improve business continuity and application performance in general by allowing organizations to more fully leverage their scalable cloud infrastructure.
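The lifecycle review Flynn describes can be boiled down to a simple placement policy: classify each data set by how recently it was accessed and decide whether it belongs on premises or in cheaper cloud archive storage. The sketch below is an illustrative assumption, not any vendor's method; the 90-day threshold and the data-set names are hypothetical.

```python
import datetime as dt
from typing import Dict

# Assumed policy for illustration: anything untouched for roughly a
# quarter is a candidate for the cloud archive tier.
ARCHIVE_AFTER_DAYS = 90
NOW = dt.datetime(2017, 9, 1)

def tier_for(last_access: dt.datetime) -> str:
    """Return a placement tier based on last-access age."""
    age_days = (NOW - last_access).days
    return "cloud-archive" if age_days >= ARCHIVE_AFTER_DAYS else "on-premises"

# Hypothetical inventory mapping data sets to their last-access timestamps.
data_sets: Dict[str, dt.datetime] = {
    "quarterly-reports": dt.datetime(2017, 2, 1),   # cold, rarely read
    "active-orders":     dt.datetime(2017, 8, 30),  # hot, in daily use
}

placement = {name: tier_for(ts) for name, ts in data_sets.items()}
```

Real migration decisions would also weigh application dependencies, egress costs and recovery-time objectives, but even a crude age-based pass like this surfaces the data that is quietly consuming expensive on-premises capacity.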
In all likelihood, hybrid cloud environments will prove to be as diverse as the traditional data environments they replace, with each enterprise crafting customized solutions to give their products and services an edge in the marketplace. By the same token, a fully integrated hybrid cloud will likely prove elusive as there will always be pockets of data that cannot be easily reached, either by design or neglect.
But given the right management stack and the right approach to data oversight, hybrid infrastructure should evolve into a highly integrated data ecosystem – if only because it will not be worth the cost if it doesn’t.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.