Container technology is quickly emerging as the main theme of enterprise infrastructure development this year, but as with any new technology, inevitably a high degree of confusion surrounds its capabilities and its role in emerging data architectures.
Containers essentially pull a reverse on traditional virtualization: rather than creating multiple virtual machines, each carrying its own guest operating system, containers let you run multiple isolated applications on a single operating system kernel. The advantages range from greater flexibility and resource utilization to lower licensing costs and vastly improved application portability. The downside, however, is that too many questions remain around implementation, integration, management and the technology's overall impact on the data environment.
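To make the contrast concrete, here is a minimal sketch using the Docker CLI. The image names (nginx, redis) and port mappings are simply illustrative picks from the public registry, not a prescribed setup:

    # Run two isolated applications side by side on the same host OS kernel;
    # each gets its own filesystem, network namespace, and process tree.
    docker run -d --name web   -p 8080:80   nginx
    docker run -d --name cache -p 6379:6379 redis

    # Both are ordinary processes on the host -- no hypervisor, no guest OS.
    docker ps

The point is that neither container boots an operating system of its own; both borrow the host's kernel, which is where the resource-utilization and startup-speed gains come from.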
One of the biggest misconceptions, in fact, is that Docker is the only viable container solution for the enterprise, says Cloud Technology Partners' David Linthicum. Aside from the fact that Google, Amazon, Microsoft and other hyperscale providers are devising their own container solutions, CoreOS recently unveiled its Rocket runtime, designed to compete head-to-head with Docker. But it is still largely unknown exactly how these platforms will help or hurt the enterprise, and in all likelihood the benefits and drawbacks will range from the substantial to the insignificant depending on data loads, application configurations and a host of other factors.
Docker, for one, is attempting to spell out the container roadmap as quickly and clearly as possible, if only to hold on to the 10-fold annual growth rate it has enjoyed so far. The company recently launched Docker Hub Enterprise, a sandbox of sorts that allows developers and system administrators to toy with preconfigured Linux containers optimized for various business applications. The company has signed up Microsoft, IBM and Amazon as partners, with IBM offering specialized middleware that allows users to create containers in-house and then access additional services through the cloud.
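For a sense of what working with preconfigured images looks like, the sketch below uses the public Docker Hub rather than Docker Hub Enterprise, whose exact workflow depends on the deployment; the image tag is merely illustrative:

    # Fetch a prebuilt, preconfigured image from a registry...
    docker pull ubuntu:14.04

    # ...and start an interactive container from it in seconds.
    docker run -it ubuntu:14.04 /bin/bash

The same pull-and-run pattern applies whether the registry is Docker's public hub or a behind-the-firewall instance.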
Docker also recently received some help with what many considered a crucial gap in the platform: the lack of a viable backup and recovery solution. Asigra has announced support for Docker in its Asigra Cloud Backup platform, providing end-to-end data protection as containers assume a greater role in the enterprise application environment. The agentless solution offers advanced features such as AES-256 encryption, autonomic healing and policy-based optimization, and can scale from single-site to multi-site infrastructures built on either grid-based vault or client architectures.
And to be sure, container success stories are starting to trickle out. One comes from online fashion house Gilt Groupe Holdings, which has leveraged the technology to improve application development and deployment by segmenting individual app components and then updating them as needed. With containers, the company simply replaces one image with another rather than pulling down entire virtual machines, reducing a minutes-long process to mere seconds. And if the change does not perform as expected, it is just as easy to swap the old image back in.
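The swap-and-rollback pattern Gilt describes maps onto a handful of CLI commands. This is a generic sketch, not Gilt's actual tooling; the container name and image tags (app, myapp:v1, myapp:v2) are hypothetical:

    # Deploy the new version by replacing the running container's image.
    docker stop app && docker rm app
    docker run -d --name app -p 8080:8080 myapp:v2

    # Rolling back is the same operation with the old tag.
    docker stop app && docker rm app
    docker run -d --name app -p 8080:8080 myapp:v1

Because only the image changes while the host and its kernel stay put, the swap takes seconds rather than the minutes a full VM redeploy would require.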
Are containers the answer to all your DevOps problems? Hardly, but they do offer a novel way to make better use of existing virtual infrastructure to better serve the rapid-fire, gotta-have-it-now style of the mobile, collaborative workforce.
At this point, the drawbacks seem manageable while the benefits could be profound.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.