Making DevOps Work on Legacy Infrastructure

    DevOps works best in agile, automated and largely software-defined infrastructure. This puts most enterprises in a bind because even now, nearly two decades into the modern virtualization era, much of IT is still steeped in legacy systems that are anything but agile and automated.

    So before an organization even begins experimenting with DevOps, it helps to contemplate the future of this aging technology. Is it best to just scrap it and launch new services on the cloud? Or is there a way to incorporate legacy infrastructure into the new operational paradigm? Answering these questions correctly could mean the difference between a successful, efficient and cost-effective transition to DevOps and a convoluted, unproductive mess.

    The first thing to consider when applying DevOps to legacy infrastructure is: what do your users want? According to Skytap Chief People Officer Jill Domanico, the answer will likely fall along generational lines. While older workers may be perfectly content with the way things are, younger people are already steeped in the flexibility and availability of mobile, cloud-facing data environments. Surveys have shown that millennials in particular weigh an organization’s technology capabilities heavily before accepting a job, which is why much of the young talent these days is flocking to start-ups rather than established companies.

    Unfortunately, the range of options for bridging DevOps across legacy and next-generation infrastructure is rather limited. Amdocs, a software developer for media and entertainment firms, recently released AmdocsOne, a DevOps deployment engine that uses machine learning and microservices to span both legacy and cloud infrastructure. The platform draws on Amdocs’ existing relationship with AWS and the AtomIQ intelligent engine to enable rapid, autonomous service delivery across hybrid clouds. A smattering of similar solutions exists in the channel, but by and large, DevOps platforms tend to focus on newer infrastructure.

    This leads us to the question of what kind of legacy infrastructure is suitable for DevOps and what should be mothballed. Legacy systems that are not available as virtualized pools of resources, for example, will have a tough time keeping up. As Wipro’s Sameer Mital notes, an Agile Integrated DevOps platform that mixes applications, infrastructure and crowd-sourcing solutions requires organizations to pull diverse assets together quickly and effectively, so anything residing in fixed hardware silos will remain out of the loop. Ultimately, though, leaving those systems behind can undermine DevOps processes and the products they create, because the loss of critical, or even non-critical, data can hamper performance in rapidly evolving production environments.

    This is why you should not circumvent enterprise IT entirely, says Travis Greene, director of strategy for IT operations at Micro Focus. The so-called NoOps movement is built on the idea that anything local is out-of-date and the only right way to proceed is through cloud-based SaaS solutions. This approach creates divides between existing and emerging services, however, which will eventually interfere with the enterprise’s ability to pursue initiatives like the IoT and advanced business intelligence, and even to maintain regulatory compliance and security.

    It seems, then, that as usual there are no clear answers when it comes to applying DevOps on legacy infrastructure. Partly, this is due to the fact that there are no clear lines between what is merely legacy and what is truly obsolete.

    One thing is certain, however: Enterprises that wish to compete in the new economy must apply tools like DevOps to the entire data ecosystem. And that means any infrastructure that is not expected to play a role going forward must be relieved of its data and decommissioned sooner rather than later.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.


