    A Banner Year Ahead for Containers

    Of all the technology initiatives taking shape for the coming year, none is as potentially disruptive as containers. The “new virtualization,” as some are calling it, stands not only to remake enterprise architecture as we know it, but also to produce a dramatically different application and services environment than we have now.

    Containers are likely to get a boost in enterprise deployments now that Microsoft has added support in the new Windows Server 2016 platform. Microsoft still rules the roost when it comes to enterprise-class software and operating systems, so it has more influence than most over how containers get integrated into legacy data environments.

    The platform now supports containers in three ways: through the Hyper-V hypervisor, on Windows Server itself, and via the Docker Engine for Windows. Each of these mechanisms can be selected at deployment, so enterprises can tailor specific environments based on isolation, flexibility or other requirements. At the same time, the company is fostering greater application compatibility through ASP.NET and other frameworks.
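
    For readers who want to see what that choice looks like in practice, here is a minimal sketch using the docker Python SDK (docker-py). It assumes a Windows Server 2016 host running Docker Engine and a hypothetical "hello-windows" image; the isolation argument is what switches a workload between a lightweight Windows Server container and a more strongly isolated Hyper-V container.

        # Minimal sketch, assuming a Windows Server 2016 host with Docker
        # Engine installed and a hypothetical "hello-windows" image available.
        import docker

        client = docker.from_env()

        # Windows Server container: shares the host kernel, lowest overhead.
        print(client.containers.run("hello-windows", isolation="process",
                                    remove=True))

        # Hyper-V container: same image, but run inside a lightweight utility
        # VM for stronger isolation from the host and other containers.
        print(client.containers.run("hello-windows", isolation="hyperv",
                                    remove=True))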

    Meanwhile, the Linux community is working to create a more streamlined development environment for containers, given the turmoil that has arisen between leading developers like Docker and CoreOS. The Linux Foundation is putting together a nine-member technical governing body under its Open Container Initiative that would act as referee over disputes, while founding members of the group, including Docker, CoreOS, Google and Huawei, will oversee technical development of container runtimes and other specifications.

    The reason containers are so intriguing is their ability to support microservices – little bundles of code that can be mixed and matched on the fly to perform all manner of functions. This produces an entirely new application consumption model, says KEMP Technologies’ Jason S. Dover: instead of choosing a pre-defined offering from an app store, users, including automated applications, will be able to assemble the specific functions they need and deploy them on highly nimble virtual architectures. Not only does this enable highly dynamic scalability and deployment flexibility, but it also produces a just-in-time delivery model for key services – an essential capability for companies looking to capture the often short-lived business opportunities driven by the Internet of Things.
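
    To make the idea concrete, the following is a purely hypothetical example (not anything from KEMP) of such a bundle: a one-endpoint service built on Python's standard library that does exactly one job and could be packaged into its own container, then composed with other services behind a load balancer.

        # A hypothetical single-purpose microservice: one narrowly scoped
        # function exposed over HTTP, small enough to live in its own container.
        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class QuoteHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # Single responsibility: return a price quote for one product.
                body = json.dumps({"product": "widget", "price_usd": 9.99}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            # Bind to all interfaces so the service is reachable once containerized.
            HTTPServer(("0.0.0.0", 8080), QuoteHandler).serve_forever()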

    This may sound heavenly, but there is danger here as well, says Chef’s Julian Dunn. As with any emerging technology, there will be a temptation to containerize everything, regardless of whether the application or service at hand will truly benefit from this level of virtualization. Legacy apps, for example, might not take to containers so readily. Containers also tend to break down tried-and-true dev/ops principles, which could leave IT in a bind if it suddenly has to run patches, reboots, performance tuning and other tasks across thousands of containers, many of which bundle their own full Linux userland. Containers certainly have the potential to do wonderful things for the enterprise, but not if their environment is designed poorly.

    In 2016, then, the enterprise will need to concentrate largely on container deployment and interoperability, both with other containers and with legacy infrastructure. In the years that follow, however, optimization and automation could very well create a new style of enterprise computing that will put today’s architectures out to pasture.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
