Web-scale computing, Big Data, software-defined everything and a host of other forces are coalescing around a major shift in data center architecture, one that may finally put an end to legacy, silo-based infrastructure.
Demand for so-called converged infrastructure (CI) is ramping up significantly in enterprise circles, most likely because advanced architectures like software-defined networks (SDN) and Hadoop clusters are easier to deploy on greenfield infrastructure than to force-fit into existing environments. And the cheapest and easiest way to build new infrastructure is through modular components.
According to Zenoss, nearly half of enterprises in North America are taking this route. In its latest survey, the company reports that 46 percent of organizations are already using CI in one capacity or another, a 53 percent increase over last year. In addition, another 44 percent are planning to deploy a CI platform, with most of that group intending to develop their own solution rather than adopt an off-the-shelf platform like Cisco’s UCS or VCE’s Vblock.
Still, it seems that the top vendors, with their channel presence and extensive R&D capabilities, already have a lock on the market. Or do they? Smaller competitors smell blood in the water, and there is ample precedent for seemingly dominant players losing out in times of great change. This is what motivates companies like Nutanix, which recently unveiled a Vblock-to-Nutanix migration service aimed at drawing converts to its “hyper-converged” platform. The company is betting that continued strife within the VCE consortium will prevent both users and channel partners from fully committing to the platform. The migration program features a mix of services and hands-on support designed to first move data loads to Nutanix clusters and then provide support and training for their use, preferably at or below the cost of the initial VCE deployment.
At the same time, Hewlett-Packard is said to be pursuing hyper-converged platform developer SimpliVity, creator of the OmniCube system, which packs not only compute, storage and networking modules into a commodity x86 server, but specialized functions like backup, deduplication and WAN optimization as well. HP already has the ConvergedSystem platform, which cobbles together various 3PAR, BladeSystem and networking devices under the OneView management stack. However, CRN’s Kevin McLaughlin notes that SimpliVity brings a modular, “building block” approach to CI that would allow HP to better compete against VCE, Nutanix and other systems. SimpliVity, for its part, has said it is not involved in acquisition talks with HP or anyone else.
All of these market machinations should not distract enterprise executives from the most salient fact about CI: it’s not about deploying one platform or another, but about reinventing IT infrastructure as we know it. That will require a number of shifts in IT thinking, says VCE architecture specialist Archie Hendryx in his article for Sys-Con Media. First, realize that time-to-deployment for new services will drop drastically, so plan for a capacity-on-demand buffer large enough that users are not left wanting. You will also need to streamline key policies like security and data protection to reflect the more integrated, tightly structured hardware/software environment. The same goes for architectural and compliance standards, which should reflect the comprehensive nature of CI rather than the component-by-component approach of traditional infrastructure.
It turns out, then, that of all the changes hitting the enterprise, CI is the most fundamental. Not only does it provide a more streamlined and, if built properly, more efficient infrastructure, but also a more scalable one, in support of the dynamic, highly collaborative data environments that are taking over the business world.
Despite what many people say, hardware still matters. It may remain behind the scenes once the transition to CI is complete, but its deployment and upkeep will remain a crucial aspect of a well-run data environment.