Are Containers and VMs Destined to Be Together?


    The enterprise has grown quite accustomed to virtualization, so comfortable, in fact, that virtualized infrastructure is now routinely used to support mission-critical applications.

    But you didn’t think the tech industry would allow you to get too comfortable, did you? Along come containers and immediately word starts to spread that this is the perfect solution to rid the data center of all that clunky virtualization stuff.

    Whether or not containers will lead to wholesale replacement of virtual infrastructure will likely be an ongoing debate in the coming years. But if recent developments are any guide, containers will continue to make strides in emerging data architectures even as virtualization vendors like VMware seek to incorporate them into their broader cloud-facing platforms.

    Docker’s emerging support for software-defined networking and other components of abstract architecture is a case in point. With the recent acquisition of SocketPlane, the company is poised to distribute its container technology across widely distributed infrastructure. At the same time, it has gathered plug-ins from vendors ranging from Nuage Networks to Cisco – with VMware in the mix as well – and is pursuing its own Swarm open source project, designed to distribute containers across multiple hosts and allow them to communicate via a range of networking solutions.

    Will this eliminate the need for virtual machines? Perhaps, but it is not necessarily a slam dunk.

    VMware has been busy building support for cloud-native applications with both the AppCatalyst hypervisor, which features a developer-friendly API and command-line interface (CLI), and the Project Bonneville effort to provide seamless Docker integration for vSphere and vCenter Server. The idea behind the latter is to use the Instant Clone feature of vSphere to provide stripped-down virtual machines for individual containers so that they fit more comfortably within legacy VMware environments. In this way, organizations gain container flexibility and isolation while reducing overhead and complexity in the broader data environment.

    Fair enough, but wouldn’t it be better if containers could function on their own, with no virtual layer at all? Perhaps, but this might not be to everyone’s liking, says Simon Sharwood of the UK’s The Register. It turns out that developers love containers because they are lightweight and flexible, but operations teams prefer virtual machines because they are easier to secure and manage. The beauty of Project Bonneville is that it allows developers to work in native container environments without realizing they exist on a virtual plane, while operations can manage them just like any ordinary VM. Everybody wins.

    Intel is getting into the act as well with a new solution called Clear Containers, says InformationWeek’s Charles Babcock. The company says the technology addresses enterprise concerns over security while at the same time improving the performance of VM-housed containers. As with VMware’s approach, the idea is to run a stripped-down VM whose only purpose is to support a single container. The VM can be spun up in about 200 milliseconds and occupies only about 20 MB, small enough to fit roughly 3,500 within 128 GB of RAM. The Linux kernel that supports the container is minimal as well – about one-twentieth the size of a normal kernel. So in the end, users gain a full container kept safely within the hardened boundaries of a VM, all backed by hardware-supported security.
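    A quick back-of-envelope check of those density figures is instructive. The numbers below are the ones quoted in the article; the “headroom” interpretation – that the gap between the per-container budget and the VM overhead is what remains for the workload itself – is our own reading, not Intel’s.

    ```python
    # Back-of-envelope check of the Clear Containers figures quoted above.
    # TOTAL_RAM_MB, VM_OVERHEAD_MB and CONTAINERS come from the article;
    # "headroom" is an assumed interpretation of the remaining budget.
    TOTAL_RAM_MB = 128 * 1024   # 128 GB host
    VM_OVERHEAD_MB = 20         # quoted size of one stripped-down VM
    CONTAINERS = 3500           # quoted density

    budget_per_container = TOTAL_RAM_MB / CONTAINERS
    workload_headroom = budget_per_container - VM_OVERHEAD_MB

    print(round(budget_per_container, 1))  # 37.4 MB per container
    print(round(workload_headroom, 1))     # 17.4 MB left for the workload
    ```

    In other words, the quoted 3,500-container figure implies a budget of about 37 MB per container, of which the stripped-down VM consumes roughly 20 MB – leaving a modest but workable slice for each containerized workload.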

    Container supporters say that concerns over management and security are no different from the ones that arose in the early days of virtualization. Fear of the unknown always causes people to tread carefully, but once both the technology and the comfort level mature, containers will take their rightful place as the new virtualization.

    This isn’t an unreasonable assumption, particularly as data productivity becomes more dependent on cloud applications and microservices. But if the past is any guide, the replacement of one technology by another is rarely total. So as long as both emerging and traditional data operations hold sway in the enterprise, there is no reason to think containers, virtual machines and hybrids of the two will not continue to thrive in the data center.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
