The enterprise may still have high hopes for the hybrid cloud, but technology challenges are preventing many organizations from gaining all the benefits they had originally expected – particularly the ability to burst data and extend applications across local and distributed infrastructure.
But while many firms are gravitating toward all-public platforms for their data needs, could the solution to the hybrid cloud's problems be right around the corner?
Many developers are touting containers as the key to effective hybrid management. By encapsulating a full runtime environment within each container, the enterprise should gain a level of portability across disparate resources that makes geographic location irrelevant to overall performance. It may not be a complete solution (not yet, anyway), but many boosters say it does provide the scalability and centralized control needed for agile business models.
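That encapsulation can be illustrated with a minimal Dockerfile sketch (the application file and dependency list here are hypothetical, not taken from any vendor mentioned above): the runtime, the dependencies and the code all travel inside one image.

```dockerfile
# Pin the exact runtime version the application was tested against
FROM python:3.9-slim

# Dependencies are installed inside the image, not on the host
COPY requirements.txt .
RUN pip install -r requirements.txt

# The application code ships together with the environment that runs it
COPY app.py .

CMD ["python", "app.py"]
```

Because everything the workload needs is baked into the image, the host only has to supply a container runtime; whether that host sits in a local datacenter or a public cloud region becomes irrelevant to the application itself.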
Hybrid infrastructure developers like Elastifile say containers bring new levels of flexibility to abstract data environments and allow the kind of data-centric management capabilities that foster dynamic operational environments. A containerized workflow, after all, needs a management stack that is persistent and highly available, seamlessly scalable and universally accessible across clusters, sites and clouds. A container’s ability to support a distributed file system fulfills all three requirements, providing highly granular integration between on-premises and public cloud resources.
Red Hat is also adapting its container platform for hybrid workflows, says Cloud Pro's Clare Hopping. The new OpenShift Container Platform 3.4 allows organizations to utilize Docker containers and the Kubernetes orchestration platform to allocate resources wherever they are needed and then deploy applications on multiple platforms across hybrid architectures. The system incorporates Red Hat's Gluster software-defined storage solution to support both stateful and stateless applications under a single data environment. In this way, the company says it can support legacy and forward-leaning workflows under a single, overarching management system, while at the same time enabling collaboration across isolated, multi-tenant Kubernetes namespaces.
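The stateful side of that claim can be sketched with standard Kubernetes objects (the names and image below are illustrative, and in an OpenShift/Gluster deployment the claim would be satisfied by Gluster-backed storage): a PersistentVolumeClaim gives a containerized application storage that survives the pod being rescheduled anywhere in the cluster.

```yaml
# Request cluster-managed storage that outlives any single container
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: app-data                 # illustrative name
spec:
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Gi
---
# A Deployment mounts that storage, so the pod can move between
# nodes -- or between sites in a hybrid cluster -- without losing state
apiVersion: apps/v1
kind: Deployment
metadata:
  name: stateful-app             # illustrative name
spec:
  replicas: 1
  selector:
    matchLabels: { app: stateful-app }
  template:
    metadata:
      labels: { app: stateful-app }
    spec:
      containers:
      - name: app
        image: example.com/app:1.0   # hypothetical image
        volumeMounts:
        - name: data
          mountPath: /var/lib/app
      volumes:
      - name: data
        persistentVolumeClaim:
          claimName: app-data
```

The application never names a specific disk or site; it asks the orchestrator for storage with certain properties, which is what makes the same manifest deployable on-premises or in a public cloud.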
Other hybrid cloud developers are utilizing containers as a core element in their goal to establish seamless interoperability between clouds. A California company called HyperGrid recently took the wraps off HyperCloud, a solution that uses containers to “lift and shift” Java and .NET applications across distributed environments. The system automates many workload deployment tasks, such as application dependency management, service discovery and auto-scaling, and utilizes HyperGrid’s own fabric-based infrastructure to enable seamless migration between third-party public clouds.
Will these and other container initiatives be enough to bring the hybrid cloud into the mainstream? It’s too early to tell, says InformationWeek’s Charles Babcock, but it had better happen soon, because the enterprise is quickly souring on the hybrid promise. According to the 2017 State of Cloud report, support for private clouds has dropped 28 percent since last year’s report, while planned deployment of public clouds has nearly doubled. Interest in containers is quite high – about half of respondents are planning to use them in the coming year – although actual usage is in the single digits, and many IT executives are still weighing the relative merits of leading container management solutions like Swarm, Kubernetes and Cloud Foundry.
Containers are designed around the concept of portability, so they are a boon not only to hybrid infrastructure but to on-premises and multi-cloud environments as well. As the enterprise data environment gravitates toward the cloud and distributed micro datacenters in support of IoT and Big Data applications, it is likely that containers will provide the basis for dynamic workflow management.
And ultimately, this should allow the enterprise to eliminate concerns over underlying infrastructure altogether, shifting the focus instead to optimizing data and application performance in pursuit of better business outcomes.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.