Are We at Container 2.0 Already?

Arthur Cole

Could it be? Even though the technology has barely made it to the test bed at most enterprises, are we already launching into the Container 2.0 era?

According to Mesosphere, we are. The company recently released two new additions to its Data Center Operating System (DCOS) – dubbed Confluent and Lightbend Reactive – that allow organizations to run both stateless and stateful applications within the same environment. As Forbes' Janakiram MSV explains it, the goal is to bring web-scale applications and data-centric workloads into the same fold, allowing for nifty things like integrating multiple microservices with Kafka, Cassandra, Spark and other Big Data platforms. At the same time, this should make it easier to run traditional data center applications alongside their more nimble cloud-native cousins.
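The stateless/stateful distinction at the heart of this pitch is easier to picture with a small sketch. Everything below is a hypothetical illustration, not Mesosphere's implementation: a real DCOS deployment would back the stateful service with Kafka or Cassandra rather than an in-memory list. The point is simply that a stateless handler derives its answer entirely from the request, while a stateful one must retain data between calls, which is what makes it harder for a container platform to schedule.

```python
# Illustrative sketch: stateless vs. stateful services on one platform.
# All names are hypothetical; a real deployment would use a durable store
# such as Kafka or Cassandra in place of the in-memory list.

class StatelessResizer:
    """Stateless: output depends only on the request, so any replica can serve it."""
    def handle(self, width: int, height: int, factor: float) -> tuple:
        return (int(width * factor), int(height * factor))

class StatefulCounter:
    """Stateful: retains data between calls, so the platform must pin or
    replicate its storage (the 'data-centric' workload case)."""
    def __init__(self):
        self._events = []  # stand-in for a durable event log

    def record(self, event: str) -> int:
        self._events.append(event)
        return len(self._events)

resizer = StatelessResizer()
counter = StatefulCounter()
print(resizer.handle(800, 600, 0.5))  # any instance gives the same answer
print(counter.record("login"))        # result depends on prior calls
print(counter.record("logout"))
```

Running both side by side is trivial in a script; running them side by side on a shared cluster, with the stateful one's data surviving container restarts, is the problem Container 2.0 platforms claim to solve.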

Meanwhile, a company called ContainerX is out with a multitenant Container as a Service (CaaS) platform that supports both Windows and Linux workloads. According to Computerworld, the system offers a self-deployed, self-managed environment through a single-pane management interface that can stand alone or be integrated into legacy management stacks. It provides full orchestration across compute, network and storage infrastructure and supports key functions like elastic container clusters, maintaining full resource isolation while enabling high utilization. The system is available for free for up to 100 cores.

The OpenStack community is also taking steps to kick its support for containers to the next level. Developer Mirantis recently entered into a partnership with Intel and Google to rewrite key elements of the platform’s code to incorporate the Kubernetes management system for Docker-based environments. TechRepublic’s Matt Asay notes that this will give Google a significant boost in its drive to gather the bulk of enterprise cloud workloads while at the same time giving OpenStack a real shot at simplifying the deployment and management of hyperscale cloud environments. The hope is that this will also bring some stability to OpenStack so that integration and orchestration burdens do not mount as deployments increase in scale.
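For readers who have not yet met Kubernetes, this is roughly the shape of the abstraction being folded into OpenStack: a declarative manifest that tells the system the desired state, which Kubernetes then maintains for Docker-based workloads. The manifest below is a minimal, generic sketch; the name, image and replica count are illustrative.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-demo            # illustrative name
spec:
  replicas: 3               # Kubernetes keeps three container replicas running
  selector:
    matchLabels:
      app: web-demo
  template:
    metadata:
      labels:
        app: web-demo
    spec:
      containers:
      - name: web
        image: nginx:1.21   # any Docker image
        ports:
        - containerPort: 80
```

If a container dies, Kubernetes restarts it to restore the declared replica count, which is exactly the sort of orchestration burden the Mirantis partnership aims to take off OpenStack operators' hands.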

And with little fanfare, a company called Apcera has introduced a Trusted Cloud Platform that maintains a stable policy engine for containers no matter where they reside. This is significant because it allows the enterprise to maintain consistent security, availability and functionality as container workloads transition between on-premises and cloud infrastructure. In most cases, policies are applied manually, which introduces the prospect of human error and time-consuming deployment processes in distributed, scale-out environments. The Trusted Cloud Platform provides an overlay on the data center network so that policies applied behind or in front of the firewall abide by overarching connectivity and communications rules. By enabling a robust set of operational rules, the enterprise can provide high isolation for its containers to prevent unauthorized access. This should be particularly useful for highly regulated industries like finance and health care.
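Apcera has not published its engine's internals here, so the sketch below is only a generic illustration of the policy-as-code idea the article describes: connectivity rules are declared once and evaluated identically whether a container runs on-premises or in the cloud. Every name in it is hypothetical.

```python
# Hypothetical sketch of location-independent container connectivity policy.
# Rules are declared once; the same check applies on-premises or in the cloud.

POLICIES = [
    # (source service, destination service, allowed?)
    ("web", "payments", True),
    ("web", "patient-records", False),  # e.g. a health-care isolation rule
]

def is_allowed(source: str, dest: str) -> bool:
    """Deny by default; allow only what a rule explicitly permits."""
    for src, dst, allowed in POLICIES:
        if (src, dst) == (source, dest):
            return allowed
    return False  # default-deny provides the isolation the article describes

# The container's location never enters the decision:
print(is_allowed("web", "payments"))         # True, on-prem or cloud alike
print(is_allowed("web", "patient-records"))  # False
print(is_allowed("batch", "payments"))       # False: no rule, so default-deny
```

Automating the evaluation this way, rather than applying policies by hand at each site, is what removes the human-error and deployment-time costs the paragraph above points to.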

Does all this amount to a Container 2.0 paradigm shift? It’s probably too early to tell. Even the great epochs of history were not easily identifiable as they were changing.

But it does mean that even as the enterprise takes its first tentative steps into container-based abstracted architectures, the wheels are already spinning to make the entire layer more flexible and more attuned to the kinds of workflows that will define data performance in the digital services economy.

No matter what you call it, that can only be a good thing.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.


