DevOps is all about agility, but to achieve it the enterprise needs infrastructure that is agile as well. Simple virtualization is certainly better than bare-metal resources, but to really kick things into high gear, you’ll need to support apps within environments that are portable and easily managed.
Enter containers, and their operational payloads, microservices. By packaging a complete runtime environment without the extra baggage of a full operating system, containers give an app everything it needs to run while still enabling rapid dissemination across hybrid clouds.
The advantages this brings to DevOps are clear, says MeriTalk, a government IT newsletter. For one thing, DevOps does away with traditional monolithic application development, in which multiple integrated modules form a single product that cannot be decomposed or shared with other apps. Instead, DevOps teams can continuously upgrade a product piece by piece, pushing new code into production environments without disrupting the overall service. This is difficult to do with a virtual machine, but it’s a snap with containers.
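Kubernetes, for example, supports this piece-by-piece approach directly through rolling updates, replacing a service’s containers a few at a time so the service as a whole stays up. A minimal sketch of a Deployment configured this way (the service name, image, and replica counts here are illustrative assumptions, not from the article):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend              # hypothetical microservice name
spec:
  replicas: 4
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1           # at most one pod down at any point in the rollout
      maxSurge: 1                 # at most one extra pod above the replica count
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
      - name: web
        image: example/web-frontend:2.1.0   # changing this tag triggers a rolling update
```

With a spec like this, pushing a new image tag (for instance via `kubectl set image deployment/web-frontend web=example/web-frontend:2.2.0`) swaps pods in gradually rather than all at once, which is what lets new code reach production without a service interruption.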
That is, of course, if the enterprise has an effective way to track and manage them. Kubernetes seems to have taken a leadership position when it comes to container management, but it can be difficult to implement across hybrid infrastructure without the right support from platform and service providers. This is why companies like CloudBees are upping their commitment to Kubernetes through the Cloud Native Computing Foundation. The company now offers full support for Kubernetes on its Jenkins Enterprise platform, providing users with an enterprise-class continuous delivery platform across multi-cloud architectures.
Kubernetes is not the only container management solution, however. When Netflix first began experimenting with DevOps a few years ago, it relied on Kubernetes for orchestration and scale management. Ultimately, however, it built its own management platform, called Titus, on top of Mesosphere, which itself is built on the open-source Apache Mesos kernel. In this way, Netflix can better incorporate legacy applications into containerized architectures. It’s important to note, however, that Mesos now supports Kubernetes in addition to its own Marathon orchestration engine, so the enterprise is no longer faced with a stark either/or choice of which platform to deploy.
Still, the advent of containers and serverless computing is making it difficult to predict the outcomes of DevOps projects, says Nicki Watt, CTO of tech consultancy OpenCredo. As she tells Jaxenter, the enterprise will have a tough time keeping track of where and how its services and microservices are being used without better application reliability and improved visibility into runtimes, preferably through new intelligent management systems. This level of functionality will require not just new technology, however, but new skills on the DevOps team.
Deploying DevOps projects onto containerized architectures has its share of challenges, but the operational benefits are well worth the effort. With multiple teams working on multiple microservices that can be mixed and matched with other services, the potential for truly revolutionary new digital creations is extraordinary.
Meanwhile, users no longer have to wait for the next big release, or deal with cumbersome downloads and setup, to enjoy these new capabilities. New features simply appear in the course of their normal activities.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.