    Delving into the Downside of HCI

    Hyperconverged infrastructure (HCI) is undoubtedly easier to implement, less costly and more flexible than traditional data center configurations, but that does not mean it is trouble-free.

    Indeed, as more and more enterprises jump on the HCI bandwagon, many are learning that, as with most data-related issues, the trouble starts when you scale things up.

    The key drawback to HCI is that it scales linearly: compute, storage and network resources are typically added in equal amounts, even though, as is usually the case in highly virtualized environments, storage and networking requirements rise much faster than compute.
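
    To make the imbalance concrete, the back-of-the-envelope sketch below (in Python, with purely illustrative per-node figures and growth rates, not any vendor's actual specs) models what happens when storage demand grows faster than compute but both can only be added together, one identical node at a time.

        # Hypothetical numbers only: a generic HCI appliance bundles a fixed
        # amount of compute and storage, so whichever resource runs out first
        # forces another node purchase and the other resource sits idle.
        import math

        CORES_PER_NODE = 32   # assumed compute per node
        TB_PER_NODE = 20      # assumed usable storage per node

        def nodes_required(cores_needed, tb_needed):
            """Nodes needed when compute and storage can only scale together."""
            by_compute = math.ceil(cores_needed / CORES_PER_NODE)
            by_storage = math.ceil(tb_needed / TB_PER_NODE)
            return max(by_compute, by_storage)

        # Illustrative workload: compute grows 10% a year, storage 40%.
        cores, tb = 128.0, 80.0
        for year in range(1, 6):
            cores *= 1.10
            tb *= 1.40
            n = nodes_required(cores, tb)
            idle_cores = n * CORES_PER_NODE - cores
            print(f"year {year}: {n} nodes, ~{idle_cores:.0f} idle cores")

    Within a few cycles, the node count is driven entirely by storage, and the extra compute that comes along with each node is simply stranded.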

    This imbalance is most acute in software-only approaches to HCI, which utilize advanced management and abstraction tools to oversee commodity, white-box hardware, says Dell EMC’s Alan Atkinson. But there are some things that software cannot correct on its own. In a fully optimized solution, for instance, storage subsystems must be designed from the ground up for performance and reliability, while the server itself must be able to distribute data across numerous autonomous drives in order to maintain consistent levels of cache performance. This doesn’t happen with a general-purpose server, which is typically designed to optimize the highest average level of performance for each compute cycle.

    Even with linear scaling, however, adding and subtracting resources is still easier than in a distributed architecture and will likely be cheaper even if certain resources are over-provisioned, argues ActualTech Media’s Scott D. Lowe. While it is true that storage resources tend to scale faster than compute, this will very likely turn out to be a temporary problem, as vendors are already recognizing the need for storage- and network-heavy modules. In the very near future, enterprises will likely have a range of options for configuring highly customized HCI solutions based on their workloads and application portfolios.

    But the problems with scaling storage in an HCI environment go much deeper than simple resource imbalance, says Storage Switzerland’s George Crump. Replication and VM portability are also starting to cause headaches for the east-west networking architectures that exist in most deployments. At the moment, enterprises have two choices for replication: straightforward data copying or more advanced erasure coding techniques. The problem with the former is that it requires more capacity – sometimes on the order of 3x – while erasure coding is more computationally complex and consumes more network resources, which hampers overall performance. There are ways to architect deployments around these problems, of course, but that diminishes the plug-and-play advantages that modular infrastructure is supposed to provide.
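
    The capacity-versus-compute trade-off between those two protection options can be sketched with simple arithmetic. The copy count and stripe geometry below (3-way copies versus a 4+2 erasure-coded layout) are common illustrative choices, not a specific vendor's defaults.

        # Rough arithmetic for the two protection schemes described above.
        def replication_raw_tb(usable_tb, copies=3):
            # Every byte is stored 'copies' times, so raw capacity is a multiple.
            return usable_tb * copies

        def erasure_coded_raw_tb(usable_tb, data_shards=4, parity_shards=2):
            # Each stripe holds data plus parity, so overhead is (k + m) / k.
            return usable_tb * (data_shards + parity_shards) / data_shards

        usable = 100  # TB of data to protect (hypothetical)
        print(f"3-way replication:  {replication_raw_tb(usable):.0f} TB raw")   # 300 TB
        print(f"4+2 erasure coding: {erasure_coded_raw_tb(usable):.0f} TB raw")  # 150 TB

    The erasure-coded layout roughly halves the raw capacity in this example, but every write and rebuild now involves parity computation and traffic to more nodes, which is the network and performance cost Crump points to.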

    HCI also presents unique security challenges. According to BitDefender’s Liviu Arsene, a legacy firewall approach will not provide adequate protection in the software-defined architectures that underpin HCI. Instead, organizations will have to shift to application-based policies that can adjust more effectively to the rapid flexibility of modern data ecosystems. At the same time, they will need to employ multi-vendor solutions in scale-out environments and limit access to control-plane data to thwart attempts to compromise entire clusters.
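
    As a minimal sketch of what an application-based policy looks like in contrast to a perimeter firewall rule, the snippet below permits traffic only between explicitly paired application tags. The tags, services and default-deny behavior are hypothetical examples rather than any particular product's policy model.

        # Default-deny, application-tag-based flow control (illustrative only).
        ALLOWED_FLOWS = {
            ("web-tier", "app-tier"): {"https"},
            ("app-tier", "db-tier"): {"postgres"},
        }

        def flow_permitted(src_app, dst_app, service):
            """Permit only explicitly whitelisted app-to-app flows."""
            return service in ALLOWED_FLOWS.get((src_app, dst_app), set())

        print(flow_permitted("web-tier", "app-tier", "https"))    # True
        print(flow_permitted("web-tier", "db-tier", "postgres"))  # False: no direct path

    Because the rules follow application identity rather than network location, they continue to apply as workloads migrate across the scale-out cluster.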

    None of this is to suggest that HCI is not a worthwhile endeavor or that it does not provide real, appreciable benefits to data providers and users. But it does represent a dramatically different form of data infrastructure, with ramifications far beyond simple provisioning and management.

    Understanding the myriad ways in which HCI will alter the data environment, both on the systems and operational levels, goes a long way toward ensuring that the changes it brings will be positive ones.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
