Containers Poised to Remake the Enterprise


    Like virtualization, it seems that containers are going to work their way into the enterprise by stealth – that is, whether the people in charge of technology and infrastructure want them or not.

    Part of this is due to the advent of the cloud. The more the enterprise offloads data and applications to third-party infrastructure, the less it has to say about the make-up and configuration of that infrastructure. But part is due to the fact that, like virtualization, containers are making their way into leading data platforms where they will exert their influence through standard upgrade and refresh cycles.

    A case in point is container management firm CoreOS’s decision to integrate Google’s Kubernetes cluster management system into its new Tectonic platform. According to ZDNet’s Steven J. Vaughan-Nichols, this will enable enterprises to manage Linux containers within their data centers in scale-out cloud fashion and, by extension, foster compatibility with existing Google applications, which are almost universally housed in containers managed by Kubernetes. As the enterprise gravitates toward private clouds, particularly Linux-based clouds, an integrated container stack will be crucial for the delivery of applications and microservices to a diverse workforce. Other Linux developers, such as Mirantis and Mesosphere, are also working to integrate Kubernetes into their platforms.
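    To give a rough sense of the declarative, scale-out style of management Kubernetes brings, a minimal deployment manifest might look like the following sketch (the names and image here are hypothetical placeholders, not drawn from Tectonic or any specific product):

```yaml
# Hypothetical Kubernetes Deployment: asks the cluster to keep
# three replicas of a containerized web service running.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend
spec:
  replicas: 3                # desired number of identical containers
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
      - name: web
        image: example/web-frontend:1.0   # placeholder image name
        ports:
        - containerPort: 8080
```

    The operator declares the desired state (three replicas of this image), and the cluster manager continuously reconciles reality toward it, rescheduling containers when hosts fail.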

    Meanwhile, Red Hat is building up a broad ecosystem of container-based developers as a means to optimize the technology for the upcoming Atomic version of the RHEL distribution, says CloudHub’s Daniel Robinson. The company has launched a certification program to vouch for trusted, secure application containers supporting Docker and the Docker Engine, which should make it easier to deploy a uniform container environment across hybrid clouds. A key benefit to the enterprise will be the establishment of common application lifecycle functions, such as security and certification, across distributed architectures, providing a level playing field for all container-based apps and services.

    The shift toward a container-based ecosystem will be felt most keenly by development teams, says TechRepublic’s Nick Hardiman. With deployment and configuration of full data infrastructure now a simple matter of writing code, developers gain a free hand in crafting the systems and resources that support their creations from development to test to full-scale production. While it is true that solutions like Docker have trouble working across multiple hosts, the platform is continuously evolving through third-party contributions to extend its capabilities across complex data architectures.
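    The "infrastructure as code" point above can be made concrete with a minimal Dockerfile, the text file from which Docker builds a portable image. This is an illustrative sketch for a generic Python application, not an example from any of the vendors discussed:

```dockerfile
# Hypothetical example: package an app and its dependencies into one image
FROM python:3
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependencies are baked into the image
COPY . .
CMD ["python", "app.py"]              # placeholder entry point
```

    A developer builds the image once with `docker build -t myapp .` and runs it anywhere with `docker run myapp`; the same artifact moves unchanged from a laptop to test to production, which is precisely what gives development teams that free hand.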

    One of these is ClusterHQ, which has taken on the task of bringing the large data sets that most applications require inside the container itself. Currently, the databases that support modern apps must either undergo major modifications to fit inside a container or reside elsewhere, diminishing application performance. ClusterHQ’s Flocker platform embeds the database within a Docker container, allowing it to follow the app wherever it goes. The company is also working on a tool called Powerstrip, aimed at prototyping Docker extensions for distribution across virtual and cloud architectures.

    In light of all this activity, containers are poised to emerge as an integral component of the cloud, which itself is on the way to dominating IT infrastructure both inside and outside the data center. Virtualization laid the groundwork for this transformation, but containers will propel it into the high-speed, highly diverse data environment that will drive data productivity for another generation.

    But don’t make the mistake of thinking that since containers are inevitable, they do not warrant careful attention on the part of IT. On their own, containers can streamline infrastructure, but carefully orchestrated and optimized, they can deliver an entirely new level of application functionality – one that is vastly more suited to the rapid-fire data consumption of the emerging collaborative workforce.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
