Managing the Transition from Legacy to Forward-Facing IT


    The enterprise is currently trapped between two worlds: the emerging world of agile development, digital processes and unlimited scalability, and the legacy world of fixed hardware/software architectures and rigid, silo-based infrastructure.

    The problem is that while everyone is eager to move forward into flexible virtual and cloud environments, most critical business applications remain rooted in legacy systems and cannot be discarded at the drop of a hat.

    There are ways to bridge this divide, of course, but for most enterprises it comes down to two main factors:

    • Is continued reliance on legacy infrastructure worth the time and expense it entails?
    • If not, how can the transition to new infrastructure take place without disruption to critical business applications?

    A key consideration in migrating legacy infrastructure to support modern application and data loads is extending the automation and orchestration that characterize next-gen architectures into the existing data environment. Cisco recently released a new platform called Tetration that attempts to do this through artificial intelligence and advanced workflow optimization. The system, in fact, is modeled largely on the way telecom carriers support varied services across their networks, using real-time analytics to track data packets between switches and to manage the dependencies that arise between software components running on virtual machines. In this way, says Datacenter Knowledge’s Scott Fulton III, Cisco is trying to unify the previously distinct functions of infrastructure monitoring and services monitoring.

    Meanwhile, Gartner has been preaching the benefits of “bimodal IT” for a number of years, although defining it on paper and implementing it in production environments are two very different things. In short, the concept differentiates between a predictable IT mode and an exploratory mode, with the long-term outlook centered on full convergence between the two. But as InfoQ learned from a number of IT leaders who took part in a recent roundtable discussion, both the pros and cons of the bimodal approach will vary greatly from enterprise to enterprise depending on the state of legacy infrastructure, application requirements and business service goals.

    Without question, legacy infrastructure will have to be upgraded if the enterprise hopes to maintain in-house resources in support of emerging applications. In this way, initiatives like bimodal IT will come to rely upon increasingly modular and converged hardware that can provide both the stability needed by legacy apps and the flexibility to support emerging functions, say VMware’s Skip Bacon and Hitachi Data Systems’ Bob Madaio. The end game is to design a fully software-defined data center (SDDC) that can be configured in multiple ways to support a wide range of data services without requiring the time or expense of a hardware retrofit. At the same time, this gives the enterprise a distinct advantage when the time comes to pivot to the next big shift in data functionality, which will undoubtedly arrive sooner or later.

    Indeed, as BizTech notes, a converged infrastructure makes it easier to deploy virtually any architecture needed both now and in the future, and it reduces operational complexity and maintenance to a fraction of what legacy data centers require. As such, the enterprise will soon be able to define infrastructure according to its needs, rather than the other way around. This gives IT the ability to craft truly innovative solutions in conjunction with data users and other stakeholders as agile, DevOps-style approaches to service and application support take hold.

    As the saying goes, this will be a journey, not a destination. Enterprise infrastructure will be in a state of flux for some time, and even when the last vestiges of legacy environments are put to rest, the ongoing dynamism of the new ecosystem will make it difficult for organizations to get too comfortable with any given development, no matter how innovative it seems at launch.

    This will be to the enterprise’s benefit, of course, because it means that data operations can finally remain in sync with the changing economy for the foreseeable future.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

