    Managing Containers in the Cloud

    With so many technology initiatives hitting the enterprise these days, it’s getting difficult to see exactly how they will come together to shape the data environment of the future.

    A case in point is containers and the public cloud. On the one hand, containers make it easier for the enterprise to support emerging applications and services within private cloud infrastructure, but on the other, they also allow public providers to tailor their generic infrastructure to targeted workloads.

    According to InfoWorld’s Eric Knorr, one of the most significant under-the-radar projects at the moment is the Cloud Native Computing Foundation, which is looking to turn the Google Kubernetes container management stack into a multi-cloud foundation for distributed workloads. The group is headed up by Craig McLuckie, who co-founded the Kubernetes project at Google and is now setting his sights on incorporating Facebook, Twitter and other hyperscale providers into the Kubernetes fold. If successful, it means enterprises may soon be able to launch containerized applications and scale them to unprecedented levels using cloud infrastructure across the globe.
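
    As a rough sketch of what launching and scaling a containerized application could look like, the snippet below uses the official Kubernetes Python client. It is a minimal illustration under stated assumptions, not part of the CNCF work itself; the image, the deployment name and the replica counts are placeholders.

        # A minimal sketch, assuming the official 'kubernetes' Python client
        # (pip install kubernetes) and a kubeconfig pointing at a cluster.
        # Image, deployment name and replica counts are illustrative only.
        from kubernetes import client, config

        config.load_kube_config()  # read cluster credentials from ~/.kube/config
        apps = client.AppsV1Api()

        # Define a simple Deployment running three replicas of a container image.
        deployment = client.V1Deployment(
            metadata=client.V1ObjectMeta(name="demo-app"),
            spec=client.V1DeploymentSpec(
                replicas=3,
                selector=client.V1LabelSelector(match_labels={"app": "demo-app"}),
                template=client.V1PodTemplateSpec(
                    metadata=client.V1ObjectMeta(labels={"app": "demo-app"}),
                    spec=client.V1PodSpec(
                        containers=[client.V1Container(name="web", image="nginx:1.25")]
                    ),
                ),
            ),
        )
        apps.create_namespaced_deployment(namespace="default", body=deployment)

        # Scaling out is a one-line change to the desired replica count.
        apps.patch_namespaced_deployment_scale(
            name="demo-app",
            namespace="default",
            body={"spec": {"replicas": 30}},
        )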

    At the same time, container pioneer Docker has extended its reach into management across multiple clouds through the acquisition of Tutum, a service-based platform that provides orchestration across distributed architectures. The move puts Docker in charge of not only the creation and operation of its containers but also their coordinated management, a task that until now fell to the enterprise’s in-house programmers, says TechCrunch’s Ron Miller. With Tutum, Docker can provide the visibility and management tools needed to enable a unified environment up front, lowering costs for the enterprise and encapsulating the entire container stack within a single dashboard.
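
    The kind of programmatic control being consolidated here can be approximated with the Docker SDK for Python. The sketch below is illustrative only, assuming a local Docker daemon; the image and container names are placeholders. Tutum-style tooling layers orchestration and a dashboard over these same primitives, across many hosts and clouds rather than a single daemon.

        # A minimal sketch, assuming the Docker SDK for Python
        # (pip install docker) and a running Docker daemon; the image and
        # container names are placeholders.
        import docker

        client = docker.from_env()  # connect via standard DOCKER_* settings

        # Launch a container and inspect what is running -- creation and operation.
        web = client.containers.run("nginx:1.25", name="demo-web", detach=True)
        for c in client.containers.list():
            print(c.name, c.status, c.image.tags)

        # The management side: stop and remove the container when done.
        web.stop()
        web.remove()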

    The OpenStack community is hard at work bringing container management within its operating environment as well. The recent Liberty release features the Magnum orchestration engine, which streamlines the integration of containerized applications within the OpenStack cloud, as well as the Kuryr addition to the Neutron networking module, which handles the crucial job of networking between containers. The group has also shed the integrated release governance model for a more free-wheeling “Big Tent” approach that should give third-party developers more leeway in picking and choosing the components they want to improve upon, including those closely related to container support.

    It might be tempting to think that, with all this container activity going on, the cloud is getting ready to shed the boring, old virtual machine. That is not the case, however, as ServerWatch’s Paul Rubens noted recently. Amazon, Google, Microsoft and other hyperscale cloud providers are all working up new approaches to VM performance in the cloud, such as Microsoft’s new GS-Series VM, which pairs up to 32 cores with the 64 TB of storage and 80,000 IOPS of its Premium Storage service to provide hefty support for large SQL, MySQL and NoSQL workloads. Google is also out with a 32-core VM featuring up to 28.8 GB of memory and backed by the new Compute Engine Autoscaler function that dynamically matches resource consumption to load requirements.
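
    Conceptually, an autoscaler like this compares observed utilization against a target and resizes the instance group to close the gap. The toy Python sketch below illustrates that control loop; it is a conceptual illustration only, not Google’s actual API, and the target and size limits are made-up numbers.

        # A toy illustration of target-utilization autoscaling, the control-loop
        # idea behind services like Compute Engine Autoscaler. Conceptual Python
        # only, not Google's API; all names and numbers are placeholders.
        import math

        TARGET_UTILIZATION = 0.6   # keep average CPU around 60%
        MIN_INSTANCES = 2
        MAX_INSTANCES = 32

        def desired_size(current_instances: int, avg_utilization: float) -> int:
            """Resize so projected utilization lands back on the target."""
            raw = current_instances * avg_utilization / TARGET_UTILIZATION
            size = math.ceil(raw)  # round up, erring toward spare capacity
            return max(MIN_INSTANCES, min(MAX_INSTANCES, size))

        # Example: 10 instances at 90% CPU grow to 15, bringing average
        # utilization back down to the 60% target.
        print(desired_size(10, 0.90))  # -> 15
        print(desired_size(10, 0.30))  # -> 5 (scale in when load drops)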

    The cloud was always intended to provide a diverse environment capable of supporting all manner of data activity. Containers will no doubt play a part in this rich ecosystem, but it’s probably overreaching to think that containers and the cloud will unite to define data architectures for the foreseeable future.

    With containers, you gain a highly efficient means to mix and match a wide variety of services and microservices in support of emerging data initiatives, but there will still be plenty of ways to architect environments for more traditional purposes – or those on the cutting edge.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
