Judging by how enterprises talk about themselves these days, “Converge and Conquer” is becoming the new mantra.
It’s probably a bit premature to say converged infrastructure (CI) is sweeping the IT industry just yet, but the writing is clearly on the wall. Whether the challenge is Big Data, power consumption, mobile computing or what have you, converged or modular infrastructure is shaping up to be the best way to handle it. Indeed, now that virtualization is turning into software-defined everything, enterprises are finding it easier to deploy these new architectures on gleaming new modular systems than to force-fit them into legacy infrastructure.
And the dollars are clearly starting to flow toward CI platforms. According to IDC, the market more than doubled to $1.4 billion in the third quarter of 2013 and is set to explode to $17.8 billion by 2016. And this is just the beginning. As Gartner noted recently, the $80 billion data center hardware market is still 90 percent dominated by sales of individual server, storage and network solutions, but as early as 2015 one-third of all servers shipped are expected to be part of a converged solution.
CI is one of the lifelines that companies like HP are reaching for as they navigate the shift from traditional hardware solutions to the software-defined future. The company’s Project Moonshot platform is in the hands of early adopters and is likely to see broader distribution by the end of the year. That platform, along with advanced automation and a renewed focus on services and cloud-ready architectures, is said to be a key element in a multi-year turnaround.
Even when it comes to traditional enterprise functions, converged infrastructure seems to provide a lift. According to a recent Wikibon report, Oracle database administrators claim that productivity improves by as much as 50 percent when solutions are deployed on converged platforms. Interestingly, the improvement stems not so much from operational efficiency as from the way firms are able to rework business processes and leverage critical data more effectively, particularly when it comes to the development of innovative new applications.
But since this is entirely new infrastructure we’re talking about, the question of whether it should be proprietary or open source is front and center. Clearly, open source fans view CI as a real chance to finally break the hold that leading platform vendors have had on the enterprise for so long, and they are looking to efforts like Facebook’s Open Compute Project to foster a broad ecosystem of multivendor hardware infrastructure. But while this makes sense for Facebook, Google and other hyperscale users, the enterprise industry at large is not likely to follow a similar path. Many organizations have long relied on key vendors for core computing environments, and such a long-standing practice is unlikely to end simply because internal infrastructure is being converted to a modular footprint.
It is likely, then, that CI will emerge as the stepchild of traditional enterprise infrastructure, but will ultimately become standard practice as the shift toward software-defined, scale-out architectures continues. Economics being what they are, the pressure on CIOs to streamline data operations is most intense when it comes to physical infrastructure. Put that house in order, and it becomes that much easier to achieve the broader goals of cloud computing and the dynamic data environment.