New Platforms Tackle Container Storage Challenges

    Virtualization revolutionized server infrastructure in the data center, and containers are now poised to do the same for virtual resources in the cloud. While both introduce new layers of abstraction to support greater application and data flexibility, both also demand substantial changes to storage.

    In containers’ case, the change required is a means of providing distributed access to persistent storage. Containers are ephemeral by nature, so any data they hold is lost when they are decommissioned, a particular problem for distributed relational databases and other applications that require stateful data.
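    The problem is easy to demonstrate with Docker's own tooling. The sketch below uses placeholder container, image and volume names; data written to a container's writable layer vanishes with the container, while a named volume outlives any container that mounts it:

```
# Data written inside a container's writable layer disappears with the container.
docker run --name scratch busybox sh -c 'mkdir -p /data && echo results > /data/out.txt'
docker rm scratch                     # /data/out.txt is gone for good

# A named volume persists independently of the containers that use it.
docker volume create results-vol
docker run --rm -v results-vol:/data busybox sh -c 'echo results > /data/out.txt'
docker run --rm -v results-vol:/data busybox cat /data/out.txt   # prints "results"
```

    Even this simple mechanism, however, ties the volume to a single host, which is why distributed data services of the kind described below are needed at scale.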

    A recent study by Portworx suggests the enterprise is poised to move containers into the mainstream. Nearly a third of surveyed organizations say their container budgets for the coming year top $500,000, a number likely to increase now that Windows Server provides native support for Docker. Still, more than a quarter of respondents say the need for a persistent storage solution is the toughest challenge facing containers, topping both data management and the complexities of multi-cloud support.

    Portworx addresses these challenges with a container data service layer that provides persistent storage on local SAN/NAS infrastructure and in the cloud. A key element of this approach is a global file namespace that tracks data across distributed architectures so it can be accessed, and secured, in a consistent manner. Lately, says CIO Insight’s Mike Vizard, the company has been working with medical researchers and other high-performance computing (HPC) users, where containers backed by persistent storage volumes are proving highly adept at data-intensive workloads like genomics without the performance lag of traditional virtualization.

    Meanwhile, open source platforms are starting to address the disconnect between containers and persistent storage. A company called EasyStack recently introduced a clustering solution that allows the enterprise to integrate both OpenStack and the Kubernetes management system into a unified container ecosystem. ESContainer provides usage-based resource allocation for stateful applications, leveraging the OpenStack Cinder module to provide a persistent storage backend using either open source or vendor-defined plug-ins. In this way, organizations can maintain a default persistent storage mechanism for all containers while extending Kubernetes support to legacy storage environments.
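    The pattern ESContainer builds on can be seen in stock Kubernetes, where Cinder can back dynamically provisioned volumes. A rough configuration sketch follows; the object names and availability zone are placeholders, and `kubernetes.io/cinder` is the in-tree provisioner of this era, not ESContainer's own plug-in:

```
# StorageClass that delegates volume provisioning to OpenStack Cinder
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: cinder-default          # placeholder name
provisioner: kubernetes.io/cinder
parameters:
  availability: nova            # Cinder availability zone (assumption)
---
# A stateful application claims persistent storage through the class above
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: db-data                 # placeholder name
spec:
  storageClassName: cinder-default
  accessModes: ["ReadWriteOnce"]
  resources:
    requests:
      storage: 10Gi
```

    With a default StorageClass in place, every claim that does not name a class falls through to the same backend, which is the "default persistent storage mechanism" described above.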

    Persistent storage is also arriving in lightweight platforms suitable for bare-metal, virtual or cloud-native deployment. StorageOS recently released the public beta of its storage software solution, which provides policy-driven distributed storage for Docker deployments. The system has no hardware or kernel dependencies and features an intuitive user interface that allows developers to deploy containerized database applications with high availability in a matter of seconds. The company says it is aiming to provide the same level of portability for stateful applications that containers currently bring to stateless apps.
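    Because StorageOS surfaces through Docker's standard volume plug-in interface, the developer-facing workflow looks much like ordinary volume management. The driver and option names below are assumptions based on that plug-in mechanism, not verified product syntax:

```
# Create a volume through the StorageOS volume driver (driver/option names assumed)
docker volume create --driver storageos --opt size=5 pgdata

# Run a containerized database against it; the data outlives the container
docker run -d --name pg -v pgdata:/var/lib/postgresql/data postgres:9.6
```

    The appeal of this model is that the database container itself stays stateless and disposable; only the volume, managed by the storage layer, carries state.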

    The need for persistent storage on containerized architectures points out one of the central conundrums of modern data infrastructure: It has to be both flexible and highly reliable. This is roughly equivalent to the need for a mobile home with a concrete foundation and steel frame.

    The only viable solution is increased abstraction of compute, networking and storage resources, albeit at the expense of added complexity that in turn leads to higher latency and a more challenging security posture.

    The true test of emerging persistent, stateful container solutions, then, is not devising the proper architecture, but ensuring that its operational benefits hold as workloads scale to production levels.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
