Building a Container-Friendly Infrastructure

    The enterprise finds itself tasked with meeting the demands of an increasingly exacting user population. Users today expect sophisticated services to be delivered quickly and reliably, and they have little tolerance for failure.

    Delivering this level of service is driving many organizations to deploy new container-based architectures and the new class of microservices they support. But container-friendly infrastructure does not emerge from a vacuum, and as many enterprises are learning, it takes a fair amount of legwork to upgrade legacy environments so they can effectively support microservices architectures.

    One of the first things to change, says Tibco’s Matt Ellis, is the enterprise’s internal organizational structure. In a recent webinar highlighted on RT Insights, Ellis says that before organizations embrace solutions like the Tibco Integration Stack, they need to adopt a DevOps model of continuous integration and continuous delivery (CI/CD) for apps and services, preferably built on team-based workflows that accommodate the enterprise’s preferred integration patterns (B2B, API, MFT) and deployment models (on-premises, cloud, IoT). From there, the enterprise can employ various cloud-based and hybrid infrastructure solutions, along with the management systems needed to maintain a cohesive environment.

    Microservices will also require a new approach to networking, says Ranga Rajagopalan, CTO of Avi Networks. The speed and flexibility of containers allow applications to be broken into multiple constituent parts, which makes for an effective development environment because individual functions can be mixed, matched and updated without pulling the entire app offline. But it also requires broad east-west connectivity so that containers can communicate with one another directly, rather than relying on the north-south traffic flows of traditional data architectures. This means the enterprise will have to invest in elastic service fabrics, distributed software load balancers and real-time network data feedback to ensure flexible and reliable connectivity across distributed environments.
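    To illustrate the kind of east-west routing Rajagopalan describes, here is a minimal sketch in Python of a client-side round-robin balancer over a toy service registry. The service names and endpoints are hypothetical, and a real deployment would rely on a service mesh or distributed load balancer rather than in-process code like this.

```python
import itertools


class ServiceRegistry:
    """Toy registry mapping a service name to its live container endpoints."""

    def __init__(self):
        self._endpoints = {}

    def register(self, service, endpoint):
        self._endpoints.setdefault(service, []).append(endpoint)

    def deregister(self, service, endpoint):
        self._endpoints[service].remove(endpoint)

    def endpoints(self, service):
        return list(self._endpoints.get(service, []))


class RoundRobinBalancer:
    """Spreads east-west requests across a service's containers in turn."""

    def __init__(self, registry):
        self._registry = registry
        self._counters = {}  # per-service request counter

    def pick(self, service):
        endpoints = self._registry.endpoints(service)
        if not endpoints:
            raise LookupError(f"no live endpoints for {service!r}")
        i = self._counters.get(service, 0)
        self._counters[service] = i + 1
        return endpoints[i % len(endpoints)]
```

    Because the registry is consulted on every pick, containers can be added or removed at runtime and traffic simply rebalances – the elasticity that makes east-west fabrics workable.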

    Compared to today’s monolithic software architectures, of course, containers and microservices provide a highly dynamic and scalable development ecosystem. But there are drawbacks. For one, says AppDynamics’ Josh Symonds, they require a high degree of infrastructure automation, which is only just starting to permeate legacy systems. In addition, getting data from service to service in a consistent manner will be a challenge – one that will most likely require a set of microservices just to maintain the canonical representation of object messages and authentication data. In most cases, however, organizations will find that the upside of containerized environments outweighs the downside.
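    The canonical representation Symonds alludes to amounts to a single agreed-upon envelope that every service uses on the wire. The sketch below shows one possible shape in Python; the field names are illustrative assumptions, not any standard.

```python
import json
import uuid
from dataclasses import asdict, dataclass, field


@dataclass
class CanonicalEnvelope:
    """One shared wire format so every service parses messages the same way.

    Field names here are illustrative, not a standard.
    """

    service: str      # originating microservice
    auth_token: str   # caller's authentication data
    payload: dict     # the business object itself
    message_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    schema_version: int = 1

    def to_wire(self) -> str:
        return json.dumps(asdict(self))

    @classmethod
    def from_wire(cls, raw: str) -> "CanonicalEnvelope":
        return cls(**json.loads(raw))
```

    Versioning the schema explicitly lets individual services be updated independently – the mix-and-match property described above – without every consumer breaking at once.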

    Enterprise executives will also soon realize that containers are not the preferred solution for all apps and services. As MapR’s Dale Kim noted on Data Informed recently, containers are a poor fit for applications that require persistent data. With no means of long-term storage, a container loses its files when it is shut down – even if the intent is to redeploy it on a new server. You can store data on the host’s file system, but this inhibits the container’s portability because its movements must now be coordinated with a centralized analytics cluster. And with today’s data volumes and high-speed activity, traditional storage solutions like SAN, NAS and DBMS are non-starters, both operationally and financially. Development of persistent container storage is evolving, however, so there is a good chance that a viable option will emerge at some point.
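    Kim’s point about ephemeral storage can be sketched in a few lines of Python. This is a simulation of the idea, not real container machinery: files written to the container’s own filesystem vanish when it stops, while files written to a mounted volume survive a redeployment.

```python
class Volume:
    """Stands in for externally managed persistent storage (e.g., a mounted volume)."""

    def __init__(self):
        self.files = {}


class Container:
    """Toy container: a private, ephemeral filesystem plus an optional mounted volume."""

    def __init__(self, volume=None):
        self._local = {}       # ephemeral layer: lost when the container stops
        self._volume = volume  # persistent layer: survives across instances

    def write(self, path, data):
        # Anything under /data/ goes to the mounted volume, if one is attached.
        if self._volume is not None and path.startswith("/data/"):
            self._volume.files[path] = data
        else:
            self._local[path] = data

    def read(self, path):
        if self._volume is not None and path in self._volume.files:
            return self._volume.files[path]
        return self._local[path]  # raises KeyError if the file is gone

    def stop(self):
        self._local.clear()  # the ephemeral layer is discarded on shutdown
```

    Redeploying means constructing a new Container around the same Volume: the volume’s files carry over, the ephemeral ones do not – which is exactly why the volume’s location now constrains where the container can run.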

    Containers and microservices will most certainly comprise much of what is turning out to be the next-generation data environment, and the rise of abstract, software-defined infrastructure will allow this transition to unfold at a rapid clip.

    And once the real work of converting legacy infrastructure is complete, the next challenge will be to leverage microservices in ways that appeal to today’s discerning data consumer.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
