Containers and DevOps: Looking Into the Fine Print

    Most enterprises are at least in the planning stage for a DevOps model of data operations by now, which means containers are looming large on the list of priorities in the technology refresh cycle.

    But how do containers augment the DevOps architecture, and what deployment and configuration options should enterprises weigh when optimizing a working environment?

    According to Forbes’ C.K. Oliver, container-based workflows are essential to flexible and agile development across the entire application lifecycle. Not only do they help in the creation and deployment of apps, but also in the continuous integration/continuous delivery (CI/CD) pipelines that will be the hallmark of high-speed, highly dynamic digital infrastructure. With tools like Docker, Kubernetes and Mesos in hand, agile development teams consisting of users, developers, IT techs and other stakeholders will have a greater capacity to mix and match a wide variety of microservices and discrete application code modules to create in minutes what once took months to accomplish.
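    The mix-and-match composition Oliver describes is typically expressed in a manifest such as a Docker Compose file. The sketch below is purely illustrative; the service names and images are hypothetical placeholders, not anything drawn from the article:

    ```yaml
    # Hypothetical composition of two microservices behind a gateway.
    # Service names, images and ports are illustrative placeholders.
    version: "2"
    services:
      gateway:
        image: nginx:alpine
        ports:
          - "8080:80"
        depends_on:
          - orders
          - inventory
      orders:
        image: example/orders-service:1.0      # placeholder image
      inventory:
        image: example/inventory-service:1.0   # placeholder image
    ```

    A single `docker-compose up` then assembles the whole stack from prebuilt modules, which is the kind of rapid composition the paragraph above alludes to.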

    But a word of caution, says Corey Quinn, director of DevOps at investment firm FutureAdvisor: Not all containers are created equal, and it could be difficult, if not impossible, to impose a container environment on apps and infrastructure that are not ready for it. Just as the cloud is best suited to cloud-ready functions, containers work best with container-ready systems, so the enterprise will have to consider carefully how it intends to containerize legacy infrastructure or implement the technology across greenfield deployments. As with any major architectural shift, it is better to look before you leap to gain full understanding of migration issues, points of failure and long-term goals and objectives.

    It is also unwise to view the container as a next-generation virtual machine, says Red Hat’s Lars Herrmann. Containers may be lightweight and lend themselves to DevOps very nicely, but issues surrounding image stability and portability make it difficult to deploy them across diverse infrastructure. A VM, by contrast, abstracts the underlying hardware and carries its own environment wherever it goes. Another concern is container visibility, for which tooling is still maturing: without a means to drill down into a container and inspect its contents, developers introduce security and other risks into the enterprise data environment with each deployment.

    These issues will become more prominent in a few months with the release of Windows Server 2016, says tech consultant Michael Otey. By introducing containers into the most popular data center operating environment, Microsoft is adding yet another option for defining and supporting a wide range of enterprise functions. While the platform does not support Linux containers, it does support its own container format, either directly on Windows Server or within a Hyper-V virtualization layer for greater security and isolation. So DevOps teams will not only need to weigh the operational differences between bare-metal and virtualized containers; there is also the potential for container silos arising between Windows and Linux, even if the container management solution, like Docker, supports both environments.
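    The choice Otey describes comes down to a per-container runtime setting. On Windows, Docker exposes it through the `--isolation` flag; the commands below are a sketch using the Windows Server 2016-era base image, shown for illustration rather than taken from the article:

    ```shell
    # Run a Windows Server container that shares the host kernel.
    docker run --isolation=process microsoft/windowsservercore cmd /c ver

    # Run the same image inside a lightweight Hyper-V utility VM for
    # stronger isolation, at some cost in density and startup time.
    docker run --isolation=hyperv microsoft/windowsservercore cmd /c ver
    ```

    The same image runs in either mode, so the trade-off is operational (isolation versus density) rather than a packaging decision.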

    Without doubt, containers provide a number of key advantages over existing architectures for many enterprise functions, but they are not the solution to every problem. While containers may seem to offer a simpler development and test environment for certain apps, they may not prove optimal once the application goes live and needs ongoing support.

    In the end, the one thing that containers do not change is the need for experienced teams to determine when, where and how their digital creations are to be developed and deployed.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

