
    Fine-Tuning the Enterprise Container Environment

    If 2016 was the year the hybrid cloud entered the enterprise mainstream, 2017 could very well be the year that containers emerge as a core service development and deployment tool.

    At the moment, the buzz is high, which, if Gartner’s hype cycle model is any indication, will trail off over the next few months before the real work of integrating containers into the enterprise data stack begins. But even if the technology does not produce the universe-altering shift that early enthusiasts envisioned, it is still clear that it can change current data environments for the better in a number of ways.

    According to 451 Research, deployment of containers and container management systems is expanding at a rate of about 40 percent per year, a pace that will more than triple the market’s value to $2.7 billion by 2020. The group says that containers are a key element in what it calls the “cloud-enabling technologies” (CET) market, which also includes virtualization, automation and other technologies. Compared to other open source technologies, such as OpenStack, containers are clearly the fastest growing, both in the cloud and in related fields like DevOps.

    A few stumbling blocks remain, however, particularly when it comes to integrating container functions into legacy infrastructure. MapR recently introduced a new Converged Data Platform for Docker that provides persistent storage across container deployments, making it easier to access files, tables, messages and other database elements from multiple locations. The aim is to allow faster and more agile data access for stateful applications and microservices, which should streamline both the development process and the overall management of the containerized ecosystem.
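
    The general pattern MapR is targeting, state that outlives any single container, can be sketched with the generic Docker named-volume mechanism rather than MapR’s own API. This is a minimal illustration using the Docker SDK for Python; the volume and container names are made up for the example, and the "local" driver stands in for whatever vendor volume plugin (such as MapR’s) a production cluster would register.

        # pip install docker
        import docker

        client = docker.from_env()

        # Create a named volume. With a converged-storage plugin installed, the
        # driver would point at the external platform; "local" is the fallback here.
        client.volumes.create(
            name="orders-data",
            driver="local",  # assumption: swap in the vendor's volume plugin in practice
        )

        # Mount the volume into a container; anything written to /data survives
        # after the container exits and can be mounted by the next one.
        logs = client.containers.run(
            "alpine:3.6",
            command="sh -c 'echo hello > /data/state.txt && cat /data/state.txt'",
            volumes={"orders-data": {"bind": "/data", "mode": "rw"}},
            remove=True,
        )
        print(logs.decode())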

    Meanwhile, Red Hat has implemented a number of storage-related improvements to its OpenShift container platform that enable organizations to dynamically link container management systems like Kubernetes to more varied types of back-end infrastructure. Using the API-based Persistent Volume storage subsystem in Kubernetes, Red Hat provides an abstraction layer to integrate its own Gluster storage platform into OpenShift. This should allow users to connect containers to a range of storage tiers and provide the means to orchestrate these varied resources across NFS, iSCSI, Fibre Channel, and Amazon and Google storage platforms in support of hybrid cloud deployments.
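
    The Persistent Volume mechanism itself is standard Kubernetes: an application asks for storage by size and access mode through a claim, and the cluster binds that claim to whatever back end the administrator has registered. Below is a minimal sketch using the official Kubernetes Python client; the claim name and the storage class name ("glusterfs-storage") are assumptions standing in for whatever the cluster administrator has configured for the Gluster back end.

        # pip install kubernetes
        from kubernetes import client, config

        # Load credentials from the local kubeconfig.
        config.load_kube_config()

        # The claim requests 5 GiB of shared (ReadWriteMany) storage; the storage
        # class maps it to the Gluster-backed tier registered with the cluster.
        pvc = client.V1PersistentVolumeClaim(
            metadata=client.V1ObjectMeta(name="app-data"),
            spec=client.V1PersistentVolumeClaimSpec(
                access_modes=["ReadWriteMany"],
                storage_class_name="glusterfs-storage",  # assumption: admin-defined class
                resources=client.V1ResourceRequirements(requests={"storage": "5Gi"}),
            ),
        )

        client.CoreV1Api().create_namespaced_persistent_volume_claim(
            namespace="default", body=pvc
        )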

    Another key hurdle in the expansion of container environments is security, particularly as container creation and deployment are increasingly carried out by automated systems. Docker recently added a container-native secrets management module to its Datacenter solution that should allow the enterprise to implement granular security and access control across the entire software supply chain. The “secrets” that the system oversees include API and encryption keys, passwords and other data that containerized applications need to access in order to function properly. Using a standardized management interface, the module provides a measure of “usable security” that can then support broad orchestration of services and microservices across distributed environments.
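
    The underlying mechanism is Docker’s swarm-mode secrets feature: a value is stored in the cluster’s encrypted state and exposed to an authorized service only as an in-memory file under /run/secrets/, never baked into the image or passed as an environment variable. Here is a minimal sketch using the Docker SDK for Python; the secret value, service name and image name are illustrative assumptions, and the host must be a swarm manager.

        # pip install docker
        import docker
        from docker.types import SecretReference

        client = docker.from_env()

        # Store the secret in the swarm's encrypted state.
        secret = client.secrets.create(name="db_password", data=b"s3cr3t-value")

        # Grant one service access; inside its containers the value appears as
        # the file /run/secrets/db_password.
        client.services.create(
            "myorg/orders-api:1.0",  # assumption: illustrative image name
            name="orders-api",
            secrets=[SecretReference(secret_id=secret.id, secret_name="db_password")],
        )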

    At this point, it seems unlikely that the container juggernaut will be derailed even slightly. The issues facing the technology are centered largely on implementation rather than operation, and the advantages that containers bring to emerging IoT and cloud-based data environments are well established.

    All that’s left is the fine-tuning, which will become steadily less burdensome as the enterprise gains experience deploying and managing containers in production environments.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.

