Making the Most of Bare Metal in the Cloud

    Virtualization and the cloud were the dominant trends in IT infrastructure over the past decade, and there is no reason to think they won’t support a significant chunk of the enterprise workload going forward. But alternative solutions are starting to take hold as well, including that old standby: bare-metal servers.

    In many cases, enterprises are pursuing mixed infrastructure solutions in order to maintain the diversity demanded by increasingly complex application and data loads. Bare metal in the data center, for instance, will likely hold out as long as the enterprise employs traditional productivity apps – which experts agree should be for quite some time. At the same time, organizations are starting to see the benefits of bare-metal cloud solutions for critical workloads, even as shared, virtual resources gain popularity for nearly everything else.

    The bare-metal cloud market, in fact, is on pace to grow at an annual rate of 40 percent for the rest of the decade, says research house MarketsandMarkets. This would bump today’s $870 million market to more than $4.7 billion, all the while spurring demand for a host of enabling technologies such as non-locking compute and storage, fabric virtualization, and identity and access management. Database services, in particular, are expected to be the primary driver for bare-metal clouds, many of which will utilize in-memory storage and workload consolidation tools to minimize resource consumption.

    Bare-metal solutions are also integral to the growing preference for multi-cloud architectures, so it is important that they be built for high portability and orchestration. Rackspace, for instance, recently teamed up with Megaport to enable robust connectivity between its RackConnect architecture and public clouds like AWS and Azure. The idea is to allow even single-tenant bare-metal solutions to function within broader hybrid cloud environments while maintaining the high performance and reliability demanded by critical workloads.

    Even highly automated, software-defined architectures should incorporate some level of bare-metal provisioning, says tech consultant Keith Townsend. The more hypervisors you layer on top of bare metal, the greater the performance hit, so at some point it makes sense simply to run a standard server configuration, particularly for stable but high-volume workloads. The key challenge going forward, then, is not just orchestration, but orchestration across the multi-vendor environments that exist within many enterprise and cloud data centers. This “orchestrator of orchestrators” approach can be carried out in one of two ways: either an overriding automation stack that governs multiple vendor-specific orchestration solutions, or a hardware-agnostic system from a company like Puppet or Chef.
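    To make the “orchestrator of orchestrators” idea concrete, here is a minimal sketch of the first option – a top-level automation layer that routes provisioning requests to vendor-specific orchestrators behind a common interface. Every class and method name below is hypothetical, invented for illustration; it does not represent any vendor’s actual API.

```python
# Hypothetical sketch of an "orchestrator of orchestrators": a meta layer
# that maps hardware pools to whichever vendor-specific orchestrator knows
# how to drive them. None of these names are real vendor APIs.
from abc import ABC, abstractmethod


class VendorOrchestrator(ABC):
    """Common interface each vendor-specific orchestrator must implement."""

    @abstractmethod
    def provision(self, host: str, profile: str) -> str:
        ...


class VendorAOrchestrator(VendorOrchestrator):
    def provision(self, host, profile):
        # In practice this would call vendor A's provisioning API.
        return f"vendor-a provisioned {host} with profile '{profile}'"


class VendorBOrchestrator(VendorOrchestrator):
    def provision(self, host, profile):
        return f"vendor-b provisioned {host} with profile '{profile}'"


class MetaOrchestrator:
    """The overriding automation stack: routes each request to the
    orchestrator registered for the target hardware pool."""

    def __init__(self):
        self._pools = {}

    def register(self, pool: str, orch: VendorOrchestrator):
        self._pools[pool] = orch

    def provision(self, pool: str, host: str, profile: str) -> str:
        if pool not in self._pools:
            raise KeyError(f"no orchestrator registered for pool '{pool}'")
        return self._pools[pool].provision(host, profile)


meta = MetaOrchestrator()
meta.register("rack-1", VendorAOrchestrator())
meta.register("rack-2", VendorBOrchestrator())
print(meta.provision("rack-1", "db-node-01", "bare-metal-db"))
```

    The second option the paragraph mentions – a hardware-agnostic tool such as Puppet or Chef – effectively collapses the vendor-specific layer, but the routing problem it solves is the same one modeled here.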

    The increased use of containers is also likely to lead to more bare-metal provisioning. Management platforms like Kubernetes are starting to tout the benefits of running Docker or CoreOS workloads on native hardware rather than layering them atop a virtual environment, and this is drawing the attention of key server manufacturers. Supermicro, for example, has teamed up with Google, CoreOS and storage software firm Datera to standardize a persistent container environment that can scale to dramatic proportions. As Datacenter Knowledge’s Scott Fulton notes, this will likely lead to a new class of preconfigured hyperscale server that can provide rapidly deployable bare-metal environments for microservices and other containerized applications.
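    In a cluster that mixes bare-metal and virtualized nodes, the placement decision described above comes down to matching workload constraints against node attributes. The toy model below illustrates that decision; real Kubernetes expresses it with node labels and nodeSelector/affinity rules, and the node names, fields and scoring here are simplified assumptions, not the actual scheduler.

```python
# Toy model of placing a container on a mixed bare-metal/virtual cluster.
# Kubernetes does this with node labels and nodeSelector/affinity; this
# simplified version picks the node with the most free CPU that satisfies
# the workload's bare-metal requirement. All node data is invented.
nodes = [
    {"name": "vm-node-1",    "bare_metal": False, "free_cpu": 8},
    {"name": "metal-node-1", "bare_metal": True,  "free_cpu": 4},
    {"name": "metal-node-2", "bare_metal": True,  "free_cpu": 16},
]


def place(workload):
    """Return the name of the best-fit node, or None if nothing fits."""
    candidates = [
        n for n in nodes
        if n["free_cpu"] >= workload["cpu"]
        and (not workload.get("require_bare_metal") or n["bare_metal"])
    ]
    if not candidates:
        return None
    # Prefer the node with the most headroom among the valid candidates.
    return max(candidates, key=lambda n: n["free_cpu"])["name"]


# A containerized database that wants native hardware lands on bare metal:
print(place({"cpu": 4, "require_bare_metal": True}))  # metal-node-2
```

    The point of the exercise is that once bare metal is just another labeled resource in the pool, containerized workloads can demand it declaratively rather than through a separate provisioning path.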

    Just as iron, steel and aluminum bring key advantages and disadvantages to any building project, so too do native, virtual and container technologies to the enterprise. As organizations become increasingly dependent upon digital workflows in their core business models, the pressure will mount to tailor infrastructure to the needs of those workflows, not the other way around.

    That means IT must keep all options on the table when it comes to crafting the data environment of the future, even if it means reaching back to technologies of the past to do so.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
