Stacking Up the Container Revolution

Arthur Cole

There is nothing quite so scary in the IT universe as tearing down what you just built to make way for a new technology. Well, perhaps complete and utter network failure, but that’s about it.

But with the advent of containers, the enterprise appears to be on the cusp of reworking a fundamental element of the cloud, converged infrastructure, software-defined infrastructure, data mobility and just about every other initiative driving data center development these days. Fortunately, container-based virtualization does not require a forklift upgrade to the virtual layer, but it does alter the way virtual machines are managed, and it could force a massive rethink of the higher-order architectures slated to drive business productivity in the future.

To some, however, it was traditional virtualization’s limitations in supporting advanced data architectures that led to the rise of containers in the first place. As Virtualization Review’s Jeffrey Schwartz put it, there was growing consensus that the application loads of elastic, cloud-based platforms and applications were already pushing the limits of even the most advanced virtualization platforms, and what was needed was a higher degree of portability, speed and scale. Containers achieve this by allowing a single operating system to handle multiple apps at once, which is a much more elegant solution than deploying numerous virtual machines each populated with its own OS.
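The single-OS model described above can be sketched with a short Docker Compose file: both services below run as isolated processes sharing one host kernel, rather than each carrying a full guest operating system as a VM would. This is an illustrative sketch only; the image names and port mappings are assumptions, not anything prescribed by the vendors mentioned here.

```yaml
# Hypothetical docker-compose.yml: two apps packaged as containers.
# Both share the host's kernel; neither ships its own guest OS, which
# is where the speed and density advantage over VMs comes from.
services:
  web:
    image: nginx:latest    # illustrative image choice
    ports:
      - "8080:80"          # expose the web app on the host
  cache:
    image: redis:latest    # a second app on the same host OS
```

Spinning up both services is a single `docker compose up` rather than provisioning and booting two virtual machines, which is the portability and speed argument in miniature.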


You would think Docker, the container darling of the moment, would pose a direct threat to VMware in particular, given its status as the top virtualization provider in IT today. But the company has in fact embraced Docker, going so far as to integrate Docker’s container technology into vSphere, where it is said to improve the performance of some, but not all, applications. This is probably a case of embracing change before it bowls you over, but VMware still deserves credit for recognizing the validity of an alternate approach to virtualization and then making it easier for customers to leverage it.

Docker is already starting to draw a third-party development ecosystem around itself as the IT community wakes up to the transformation it represents. A company called StackEngine recently emerged from stealth with a container automation platform designed to oversee container deployment and lifecycle management in production environments. It works by offering broad visibility into the container itself and then coordinating both its access to bare-metal resources and the interactions between various hosts, providing a means to dynamically alter the container environment as application and workload requirements change. StackEngine is written in the Go language, which is also the basis for Google’s Kubernetes orchestration platform, and is likely to hit the channel by the end of the year.

The big mistake in assessing container-based virtualization’s role in the enterprise is to think of it as a full replacement for the standard virtual machine. As PernixData’s Frank Denneman notes, the choice should reflect the nature of the service you wish to provide, rather than which form of virtualization is “better.” In short, it comes down to which layer you want to leverage for service delivery and management. High-scale, high-availability applications, for example, will likely thrive in a container; others will benefit more from a standard virtual machine with services coming from the surrounding virtual infrastructure. After that, there is the question of running containers within VMware or some other virtual stack, or simply leveraging a public PaaS for your service architecture. Either way, we are talking about shifting from an infrastructure-centric to an application-centric environment, which will require a new way of looking at things from an IT perspective.

So, to be clear, we are talking about a major shift in the way we view the virtual layer and its role in the cloud-based, portable data environment that is taking shape. But that doesn’t mean the enterprise needs to undo anything that has been done so far; maybe just tweak the software a bit.

In that light, containers represent something we rarely see in IT: tremendous gain without a lot of pain.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.



