Enterprises that have migrated workloads to the cloud are quickly coming to realize that even virtualized, third-party infrastructure does not in itself provide the flexibility needed to meet emerging data requirements. This is particularly true in single-cloud environments in which resources and configuration options are limited to what the cloud provider has developed for generalized consumption.
This is why multi-cloud architectures are expected to make a big play in the coming year. By distributing data and applications across varied infrastructure, the enterprise can better tailor resources to the appropriate workload and reduce the risk of stranding workloads in cloud-based silos.
The challenge, of course, comes in managing the multi-cloud environment. Hybrid clouds, by nature, are designed to provide portability and federation across dispersed resource sets, but how advanced is this technology really? And does it operate seamlessly enough to truly propel data productivity to a new level?
It’s certainly possible, says Beta News’ Tony Connor, but it takes an experienced IT team to pull it off. This creates a catch-22 for the enterprise: even as operational responsibilities diminish in the data center, they can increase in a multi-cloud environment. In the end, the enterprise gains nothing operationally or cost-wise, because a scaled-up environment quickly hits the budget hard while knowledge workers still have their hands full managing infrastructure rather than creating new features and supporting customers. A managed services provider (MSP) that takes on full responsibility for the entire cloud stack can help alleviate this problem, but at this point it is difficult to provide a truly integrated environment spanning multiple MSPs.
New in-house platforms are starting to take aim at multi-cloud architectures, but the field is still in its infancy and isn’t likely to produce a fully seamless environment for some time, says tech consultant Andrew Froehlich. Most management tools today support commonly implemented architectures from leading providers like AWS and Microsoft, and they usually require the enterprise to alter its cloud policies to fit these general-purpose environments. For a more customized, all-encompassing environment that can span cloud-native and legacy applications, the enterprise will most likely have to wait until 2018 or later. For now, therefore, it’s probably best to concentrate on building a solid framework for multi-cloud operations and worry about the fine-tuning later.
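One common starting point for such a framework is a thin, provider-agnostic interface with vendor-specific logic kept in adapters behind it. The sketch below is a minimal illustration of that pattern in Python; the `CloudProvider` interface and the `AwsProvider`/`AzureProvider` classes are invented names for illustration, not part of any real SDK.

```python
from abc import ABC, abstractmethod


class CloudProvider(ABC):
    """Provider-agnostic operations the rest of the stack codes against."""

    @abstractmethod
    def provision_vm(self, name: str, size: str) -> str:
        """Create a VM and return a provider-neutral identifier."""

    @abstractmethod
    def teardown(self, vm_id: str) -> None:
        """Release the VM and any attached resources."""


class AwsProvider(CloudProvider):
    def provision_vm(self, name: str, size: str) -> str:
        # A real adapter would call boto3 here (e.g. ec2.run_instances)
        # and translate the neutral size into an EC2 instance type.
        return f"aws:{name}"

    def teardown(self, vm_id: str) -> None:
        print(f"terminating {vm_id}")


class AzureProvider(CloudProvider):
    def provision_vm(self, name: str, size: str) -> str:
        # Here the adapter would use azure-mgmt-compute instead; the
        # calling code never sees the difference.
        return f"azure:{name}"

    def teardown(self, vm_id: str) -> None:
        print(f"deleting {vm_id}")


def deploy(provider: CloudProvider, name: str) -> str:
    # Workload code depends only on the interface, so moving a workload
    # between clouds means swapping the adapter, not rewriting the caller.
    return provider.provision_vm(name, size="medium")


if __name__ == "__main__":
    for p in (AwsProvider(), AzureProvider()):
        print(deploy(p, "analytics-node"))
```

The point of the pattern is exactly Froehlich's advice: the framework comes first, and the per-provider fine-tuning can change later without touching the code that depends on it.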
Ultimately, says Red Hat’s Alessandro Perilli, the goal should be a single management interface for the entire IT stack. Just as most homeowners would prefer a single remote for their TV, DVD player, gaming system and, increasingly, the lights, security system and appliances, so too will the enterprise find it easier to manage data through one piece of software. Again, though, this is easier said than done, especially considering the steadily increasing intricacy of the modern data environment. Expect automation to play a key role in this effort, particularly as the knowledge workforce comes to expect greater self-service in its dealings with infrastructure.
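To make the “one remote” idea concrete, the sketch below routes self-service requests through a single entry point that picks a target by simple policy. Everything here, including the request shape, the cost table and the provider names, is invented for illustration; real platforms encode far richer policy and pull live pricing rather than hard-coded figures.

```python
from dataclasses import dataclass

# Hypothetical per-hour cost table; in practice this would come from
# provider pricing APIs or an internal cost model.
COST_PER_HOUR = {"aws": 0.096, "azure": 0.092, "on_prem": 0.050}


@dataclass
class Request:
    """A self-service request as a knowledge worker might file it."""
    workload: str
    hours: int
    needs_gpu: bool = False


def place(request: Request) -> str:
    """Single entry point: apply policy, return the chosen target."""
    # Toy policy: GPU workloads go to a cloud assumed (for this example)
    # to offer them; everything else lands on the cheapest target.
    if request.needs_gpu:
        return "aws"
    return min(COST_PER_HOUR, key=COST_PER_HOUR.get)


if __name__ == "__main__":
    for req in (Request("etl-nightly", hours=4),
                Request("model-training", hours=12, needs_gpu=True)):
        target = place(req)
        cost = COST_PER_HOUR[target] * req.hours
        print(f"{req.workload} -> {target} (~${cost:.2f})")
```

This is where the automation comes in: once placement is policy-driven, the knowledge worker files a request and never has to know, or care, which infrastructure answers it.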
Cloud providers will also have to do their part to foster greater interoperability among themselves or risk losing workloads back to the data center, says MarkLogic CEO Gary Bloom. The enterprise long ago learned the limitations of outsourcing, so it is naturally leery of providers that lock it into proprietary APIs and service management stacks. The rise of containers, in fact, should make it easier to port services and microservices across remote infrastructure, giving the enterprise more power to craft its own data environment using providers that embrace the concepts of cloud neutrality.
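One reason containers help here is that the same image runs unmodified on any host with a container runtime, whichever cloud that host lives in. Below is a minimal sketch using the docker Python SDK (docker-py), assuming Docker is installed and the daemon is reachable; the image name is just a stock example.

```python
import docker  # pip install docker

# Connect to whatever Docker daemon the environment points at --
# a laptop, an EC2 instance, or an Azure VM all look the same here.
client = docker.from_env()

# Run a stock image. Because the image carries its own dependencies,
# this call behaves identically regardless of which cloud hosts the daemon.
output = client.containers.run("alpine:3", ["echo", "portable workload"],
                               remove=True)
print(output.decode().strip())
```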
Nobody wants to be locked into someone else’s technology, of course, but neither does anyone want to push data across increasingly disparate resources that may or may not work together toward a common purpose. As Big Data increases the imperative to combine and compare data from across the data ecosystem, the need for multi-cloud interoperability will grow.
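The same pressure shows up at the data layer. With fsspec-backed libraries, for instance, pandas can read from multiple providers’ object stores through one code path, which is roughly what cross-cloud comparison starts to look like in practice. A sketch assuming the s3fs and adlfs packages are installed, credentials are configured in the environment, and the bucket, container, account and column names (all invented here) exist.

```python
import pandas as pd  # s3fs and adlfs supply the URL schemes used below

# Hypothetical paths: one dataset in AWS S3, its counterpart in
# Azure Blob Storage. fsspec resolves each scheme to the right backend.
SOURCES = [
    "s3://example-bucket/sales/2017.csv",
    "abfs://example-container@exampleaccount.dfs.core.windows.net/sales/2017.csv",
]

# Pull both into one frame so they can be combined and compared,
# regardless of which provider holds them.
frames = [pd.read_csv(url) for url in SOURCES]
combined = pd.concat(frames, ignore_index=True)
print(combined.groupby("region")["revenue"].sum())
```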
But it won’t happen overnight, and it won’t happen at all until the enterprise gains a clear understanding of where its data footprint lies and how the inconsistencies between multiple vendor platforms can be overcome.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.