The idea of fully outsourcing data infrastructure to the cloud is still novel enough to give many CIOs the shivers. But now that end-to-end data environments can be configured entirely in software, the notion is not as radical as it once was.
At the very least, the precise location of physical infrastructure is becoming less of an architectural criterion, given that functions like security, governance and resource configuration are proving less costly and more effective when they are deployed on the application or data plane rather than tied to a box somewhere. This has some people wondering whether we are on the cusp of a quiet revolution toward full utility-style computing, not because it is the latest must-have technology but because it is the most efficient, effective way to run a data environment.
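The idea of a function living on the data plane rather than on a box can be made concrete. Below is a minimal, purely illustrative Python sketch (all class and tag names are hypothetical, not any vendor's API) of a governance policy that travels with the data itself: it is evaluated per request against tags on the record, so enforcement behaves identically whether the backing storage sits in a local array or a cloud bucket.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: an access policy attached to data, not to hardware.
# Because the rules key off tags carried by the record, enforcement is the
# same on-premises or in a public cloud.

@dataclass
class Record:
    payload: str
    tags: set = field(default_factory=set)  # e.g. {"pii", "finance"}

@dataclass
class AccessPolicy:
    # Map of tag -> roles allowed to read records carrying that tag.
    rules: dict

    def allows(self, role: str, record: Record) -> bool:
        # Every tag on the record must grant this role access.
        return all(role in self.rules.get(tag, set()) for tag in record.tags)

policy = AccessPolicy(rules={
    "pii": {"compliance", "admin"},
    "finance": {"finance-analyst", "admin"},
})

customer_row = Record(payload="jane@example.com", tags={"pii"})

print(policy.allows("admin", customer_row))      # True
print(policy.allows("marketing", customer_row))  # False
```

The design point is simply that nothing in the policy references a physical location, which is what makes the underlying infrastructure interchangeable.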
For those who say their data is too broad or too complex to entrust to third-party infrastructure, we have only to look at Netflix, which recently shuttered its last video streaming data center to port its entire service to AWS. The company still maintains some back-office processes in-house, but the voluminous video feeds – the heart of its user-facing operation – are now 100 percent in the cloud. The company has made no secret that, given the scale and complexity of its operations, it had no choice but to turn to Amazon for support, which includes not just massive resources but a growing cadre of specialty services and feature sets.
This is why many leading voices in the IT industry are saying that time is short for local data centers, even those being converted to private and hybrid clouds. Tech analyst David Linthicum says that at best private clouds will be the exception in the enterprise, not the norm, given that they fail to overturn the fundamental economic disadvantage of on-premises infrastructure: Unless you’re in the business of making money by providing resources to others, you will always pay more for less scale and outdated capabilities. Already, Amazon’s share of the public cloud market exceeds the entire private cloud market by about a billion dollars.
And indeed, many vendors of virtualized infrastructure are starting to foster environments in which users can all but ignore the physical layer altogether. Cloud software developer Nutanix is adding machine intelligence and other features to its platform in the hopes of building end-to-end data ecosystems on “invisible infrastructure”; that is, users don’t know, nor do they care, what kind of box they are running on because all of the critical functionality rests on the virtual plane or higher.
The new Acropolis 4.6 release features artificial intelligence that provisions its own support infrastructure based on the outcomes human operators declare. The platform can also adapt to changing requirements and available infrastructure to optimize itself continuously, essentially teaching itself how to provide top-notch support for application and data loads.
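Nutanix's internals are not described here, but the general pattern the paragraph points at, where operators declare a desired outcome and the platform derives the provisioning steps itself, can be sketched. The toy Python reconciliation loop below illustrates intent-based provisioning in general; it is not Nutanix's actual mechanism, and all names are invented.

```python
# Toy illustration of declarative, intent-based provisioning: the operator
# states a desired end state and the platform converges toward it, rather
# than being scripted step by step. Generic sketch, not a vendor API.

def reconcile(desired: dict, actual: dict) -> list:
    """Return the provisioning actions needed to reach the desired state."""
    actions = []
    for resource, want in desired.items():
        have = actual.get(resource, 0)
        if want > have:
            actions.append(f"provision {want - have} x {resource}")
        elif want < have:
            actions.append(f"release {have - want} x {resource}")
    return actions

# The operator declares only the outcome...
desired_state = {"web-vm": 4, "db-vm": 2, "cache-node": 1}
# ...the platform inspects what actually exists...
actual_state = {"web-vm": 2, "db-vm": 3}
# ...and derives the steps itself.
for action in reconcile(desired_state, actual_state):
    print(action)
```

Running the loop repeatedly is what lets such a system adapt as requirements or available hardware change: the declared intent stays fixed while the computed actions track reality.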
All of this is merely the end result of the software-defined data center (SDDC), which is closer to reality than ever now that the final leg of the three-legged infrastructure stool, networking, is becoming virtualized, says F5’s Gary Newe. While the focus in the cloud today is on enabling Software-, Platform- and Infrastructure-as-a-Service, the end game is full IT-as-a-Service, where the systems, applications, resources and everything else needed to fill out the IT stack are available with a few mouse clicks. This will speed up time-to-deployment of new functions and capabilities, and in turn kick the entire digital economy into high-speed, even real-time, performance – as if the current pace of business were not fast enough already.
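The as-a-service layering can be pictured as nested software requests: each tier asks the one below it for what it needs, so a full IT stack resolves from a single top-level call. The Python sketch below is a deliberately simplified, hypothetical model of that idea, not any provider's interface.

```python
# Illustrative sketch (generic, not a vendor API): the as-a-service tiers
# modeled as layered software requests. A single top-level call resolves
# the whole stack, which is the "few mouse clicks" idea in code form.

def infrastructure(spec):   # IaaS: raw compute and storage
    return {"vms": spec["vms"], "storage_gb": spec["storage_gb"]}

def platform(spec):         # PaaS: a runtime layered on infrastructure
    infra = infrastructure(spec["infra"])
    return {"runtime": spec["runtime"], **infra}

def it_as_a_service(spec):  # ITaaS: applications layered on the platform
    plat = platform(spec["platform"])
    return {"apps": spec["apps"], **plat}

stack = it_as_a_service({
    "apps": ["billing", "analytics"],
    "platform": {
        "runtime": "jvm",
        "infra": {"vms": 3, "storage_gb": 500},
    },
})
print(stack)
```

The point of the layering is that each tier only ever speaks to the tier directly beneath it, which is what lets any layer, including the physical one, be swapped out without the layers above noticing.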
Even if this trend toward utility computing does not play out entirely, it is clear that data infrastructure is undergoing the most radical change in its 50-year history. The focus is no longer on bigger and faster but on nimbler and more agile, allowing the enterprise to ramp products and services up to production scale in record time and at very low cost, making highly efficient use of resources into the bargain.
It’s hard to see how things will get much better from that point, but if the IT industry stays true to form, somebody will figure it out in short order.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.