Managing the HCI Transition

    To Converge or to Hyperconverge: Why Not Both?

    Hyperconverged Infrastructure (HCI) is making serious headway into the enterprise and all signs are pointing to rapidly expanding implementation in the coming years. But as with most technology evolutions, the challenges don’t lie in deploying and provisioning the new equipment as much as in managing the transition from the old way of doing things to the new.

    With HCI, organizations are torn between the lower costs and streamlined operations that the technology offers and the need to support legacy workloads while simultaneously, and non-disruptively, moving users over to new processes that leverage tightly integrated infrastructure and scale-out cloud functionality.

    According to Gartner, the seeds of the technology’s transition from peak to trough in the hype cycle are already in place, in the form of multiple myths that will become apparent only as organizations enter the deployment phase. For one thing, says analyst George Weiss, the openness of HCI platforms generally stops at the codebase, so the supposed commoditization of HCI is not all it is cracked up to be. As well, depending on the deployment parameters, HCI does not necessarily represent the most cost-effective, scalable, or operationally efficient means of data and application support. And except in rare cases, don’t expect HCI to fully replace legacy infrastructure – at least in the short-to-medium term.

    When applied correctly, and for the right reasons, however, HCI can provide a very effective solution for a wide range of commercial applications. PeoplesBank of Holyoke, Mass., for example, found that SimpliVity’s OmniCube platform not only provides operational efficiency for key functions like backup and recovery, but also improves employee training processes when tied to its legacy vCenter management console. Still, core banking systems are likely to remain on traditional infrastructure for a while longer, and the bank is proceeding cautiously when moving legacy apps to the new setup – a process made easier by the enhanced ability to implement robust test environments before going live.

    Indeed, one of the biggest surprises about HCI is not how it changes infrastructure but how it changes processes and procedures, says the UK Register’s Danny Bradbury. The simple fact that it removes many of the storage management functions that accompany SAN and NAS deployments means even small businesses can provision their own scale-out storage environments without a lot of on-site technical staff. With hardware management now handled at the hypervisor level, even novice data users find it easier to craft their own environments, test their own apps and implement highly customized solutions that improve their work performance and enhance their value to their employers.

    But just as there is a right tool for every job, there is a right infrastructure for every use case. For HCI, the strongest cases include applications like VDI, hybrid clouds and edge computing, which are difficult, if not cost-prohibitive, on traditional infrastructure. As Datamation’s Cynthia Harvey notes, HCI’s benefits of rapid deployment, low cost and agility must be weighed against drawbacks like component-level performance issues, inflexible scalability, and vendor lock-in and support concerns. As with most technology decisions these days, solutions should be driven by user requirements, not the other way around, which means that sometimes HCI will provide the optimal experience, and sometimes it won’t.

    It is also important to understand that HCI is of minimal value without a high degree of virtualization providing resource federation and workload portability across its commodity components. And as most organizations have figured out by now, some applications perform better on bare metal than on abstracted architectures.

    This means HCI will most likely find a home in both the public and private cloud, but it probably won’t become the de facto infrastructure solution across the entire enterprise landscape. For the time being, organizations will be dealing with a mixed bag of hardware, software and middleware solutions, and the real challenge will lie not in the initial deployment and ongoing maintenance of the environment but in identifying the right way to provide optimal support for it.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
