Who, or What, Controls the Software-Defined Data Center?


The data center is becoming more software-defined, with distributed, cloud-based architectures making brick-and-mortar facilities look more and more like single computing units, basically building-sized PCs, tied to a globally networked infrastructure.

    So it shouldn’t come as any surprise that the selection of the software platform, or operating system, for the data center is emerging as one of the most important decisions on the agenda, eclipsing concerns about server, storage and networking hardware.

At this point, the only certainty when it comes to data center software is that it will have to be based on open standards. That makes Linux the default choice, given that it already owns a good chunk of the legacy data environment. Red Hat executives have not been silent on this subject, with top names like Paul Cormier, president of worldwide products and technology, crowing at the company’s recent 2015 Summit in Boston that “Linux has won the data center.” The next step, he says, is to push open source across the entire operating and application development infrastructure.

And there is certainly no shortage of Linux options for a wide range of functions. The Linux Foundation recently teamed up with PLUMgrid to form the IO Visor Project, which seeks to unite vendors, distributors and hardware manufacturers under a common programmable networking framework. Naturally, the system will be based on PLUMgrid’s IO Visor platform, which ties directly to the company’s Director service for internetwork connectivity, so it is essentially a vendor-driven framework built on top of the Linux kernel. But this is not necessarily a bad thing, given that even an open-source data environment requires some level of integration between systems and services in order to alleviate deployment and management hassles on the user end.

Even the mainframe is joining the Linux fold. As I mentioned yesterday, IBM’s new LinuxONE Emperor and Rockhopper machines run Linux natively, with the company partnering with Canonical to bring Ubuntu to the platform. In this way, the systems offer a ready-made Big Data environment that can then scale outward under an open framework. The company is even backing an Open Mainframe Project designed to foster the kind of development and support community that is already emerging around distributed white-box architectures.


But does all this Linux activity really mean that it will eventually control the compute clusters that will power next-generation data activities? Perhaps not, says The Platform’s Timothy Prickett Morgan, at least not in the way most people think. Indeed, there will be multiple layers of software control, each of which is likely to be inhabited by a different platform. OpenStack, for example, manages pooled resources on premises and in the cloud, while application schedulers like Mesos help users navigate complex and highly dynamic architectures. Then there are the container-level solutions like Docker and Kubernetes, which open up an entirely new vista of Linux-based functionality within virtualized ecosystems. Exactly where the enterprise chooses to locate the “brains” of its software-driven data architecture will be a difficult call, and those brains could very well reside on multiple layers depending on data, application and user requirements.
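To make that container layer concrete, here is a minimal sketch of a Kubernetes Deployment manifest. All names and the image are illustrative, not drawn from any vendor mentioned above; the point is that the operator declares a desired state, and the cluster software, not a human assigning workloads to boxes, decides where things actually run.

```yaml
# Hypothetical Kubernetes Deployment: the operator declares desired state;
# the scheduler decides which physical servers host the containers.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend          # illustrative name
spec:
  replicas: 3                 # Kubernetes keeps three copies running at all times
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
      - name: web
        image: nginx:1.25     # any container image would do here
        resources:
          requests:
            cpu: 250m         # resource requests drive the scheduler's placement decisions
            memory: 128Mi
```

Applied with `kubectl apply -f`, a manifest like this hands placement, scaling and failure recovery to the control plane, which is exactly the shift from hardware-centric to software-defined management the column describes.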

    All of this is going to make the good, old, hardware-centric days of enterprise infrastructure look quaint, like a Model A Ford next to a Lamborghini Aventador. The real test will be functionality, however. Software developers have been talking pretty big over the past few years about how much more effective and efficient they can make the data center. Now, it’s time for them to put their words into action.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
