Now that the enterprise has gotten its feet wet with DevOps, one salient fact has become clear: It’s not terribly effective without end-to-end automation at scale.
This is leading many forward-leaning organizations to push for DevOps environments in the cloud, since scaling up local resources is both expensive and time-consuming. But this runs into one of the major challenges of cloud infrastructure: it is very difficult to automate resources that you don't control.
According to Gartner, scaling automation broadly and deeply throughout the IT ecosystem is one of the top challenges to establishing a successful DevOps environment. Ideally, the automation stack should be able to discover all available resources and utilize them wherever they reside. This requires an architecture with end-to-end visibility across both the data center and the cloud, which frankly is neither easy nor cheap to build. Without it, however, organizations will not realize the full benefit of DevOps and will inevitably fall behind in the digital services economy.
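As a concrete illustration of what cross-environment discovery looks like, the automation stack can be modeled as a registry of per-provider discovery functions behind one common interface. This is a minimal Python sketch only; the provider names and the `Resource` type are illustrative, and a real implementation would call the actual inventory APIs of each data center and cloud provider:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Resource:
    provider: str
    kind: str   # e.g. "vm", "container", "storage"
    name: str

# Registry mapping provider names to discovery callables.
DISCOVERERS: Dict[str, Callable[[], List[Resource]]] = {}

def register(provider: str):
    """Decorator that registers a discovery function for one provider."""
    def wrap(fn: Callable[[], List[Resource]]):
        DISCOVERERS[provider] = fn
        return fn
    return wrap

@register("datacenter")
def discover_local() -> List[Resource]:
    # Stand-in for an on-premises inventory query.
    return [Resource("datacenter", "vm", "build-01")]

@register("public-cloud")
def discover_cloud() -> List[Resource]:
    # Stand-in for a cloud provider's inventory API.
    return [Resource("public-cloud", "container", "ci-runner")]

def discover_all() -> List[Resource]:
    """Aggregate resources from every registered provider."""
    found: List[Resource] = []
    for fn in DISCOVERERS.values():
        found.extend(fn())
    return found
```

The point of the registry is that adding a new environment means registering one more discovery function, not rewriting the automation layer.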
A separate study by the Ponemon Institute on behalf of Embotics offers a deeper view of the challenges of automating a DevOps environment on top of a hybrid cloud. In a survey of more than 600 IT leaders, the report found a distinct lack of confidence in the ability to manage both cost and risk on public cloud infrastructure, even as respondents rated DevOps and container-based microservices infrastructure as critical needs going forward. This led researchers to conclude that the key factor inhibiting cloud-based DevOps workflows is not a lack of desire on the part of IT but a lack of tools and skill sets to make it happen.
But this may be about to change. Like any business, MSPs know an opportunity when they see one, and many are rolling out new systems and service tiers aimed specifically at drawing highly lucrative DevOps traffic. One of the latest entrants is Seattle's 2nd Watch, which recently unveiled a pair of core automation tools to streamline and accelerate DevOps projects at scale. The Modern CI/CD Pipeline and the Machine Image Factory services are designed to run on AWS, Azure or multi-cloud infrastructure, allowing organizations to integrate existing DevOps platforms like Chef and Puppet in the cloud. The company claims this can reduce development and deployment time by up to 75 percent while lowering the overall cost and complexity of resource management.
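2nd Watch has not published the internals of these services, but the general "image factory" pattern — start from a base image, apply configuration with a tool such as Chef or Puppet, then snapshot the result as a reusable golden image — can be sketched in Python. Every name below is illustrative, not part of any vendor API:

```python
from typing import Callable, List

# A build step takes the current image state (a list of applied layers)
# and returns the new state.
Step = Callable[[List[str]], List[str]]

def base(os_name: str) -> Step:
    """Start the build from a known base operating system image."""
    return lambda layers: layers + [f"base:{os_name}"]

def provision(tool: str, recipe: str) -> Step:
    """Stand-in for running a config tool (Chef/Puppet) against the image."""
    return lambda layers: layers + [f"{tool}:{recipe}"]

def snapshot(label: str) -> Step:
    """Capture the configured image under a reusable label."""
    return lambda layers: layers + [f"image:{label}"]

def bake(steps: List[Step]) -> List[str]:
    """Apply build steps in order, producing the final image manifest."""
    layers: List[str] = []
    for step in steps:
        layers = step(layers)
    return layers

manifest = bake([
    base("ubuntu-22.04"),
    provision("chef", "hardening"),
    provision("puppet", "app-runtime"),
    snapshot("golden-2024-06"),
])
```

The value of the factory approach is repeatability: the same ordered recipe produces the same image on AWS, Azure or on-premises, which is what makes the resulting pipeline automatable at scale.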
Software companies are turning their attention to cloud automation as well. XebiaLabs recently released a new framework that cloud providers can deploy to better support DevOps workloads. The system is designed to address the critical pain points in deploying DevOps at scale, such as on-demand access to cloud resources, intelligent control of complex release pipelines and a standardized, technology-agnostic process environment. At the same time, the system allows organizations to extend automation across distributed architectures through detailed visibility, advanced dependency management and granular process control.
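The dependency-management piece of a release pipeline boils down to ordering stages so that each one runs only after its prerequisites complete. A minimal sketch using Python's standard-library `graphlib` module (the stage names are hypothetical, not drawn from XebiaLabs' product):

```python
from graphlib import TopologicalSorter

# Stage dependencies for a release pipeline: each key runs only after
# the stages in its value set have completed.
pipeline = {
    "build":        set(),
    "unit-tests":   {"build"},
    "package":      {"unit-tests"},
    "integration":  {"package"},
    "deploy-stage": {"integration"},
    "deploy-prod":  {"deploy-stage"},
}

def release_order(stages):
    """Return an execution order that respects every dependency."""
    return list(TopologicalSorter(stages).static_order())

order = release_order(pipeline)
```

In a real framework, independent branches of the graph could also be dispatched in parallel; `TopologicalSorter` supports that style of incremental scheduling as well.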
DevOps is not the kind of technology initiative that can be implemented in half-measures. If the entire process is not automated across all available infrastructure, the results are not likely to be worth the investment. To date, the ability to automate across the entire cloud has been lacking, but fortunately this gap is already on the fast track to being closed.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.