Interest Grows for the Data Center Operating System

    On the PC, various hardware and software resources must function in harmony in order to produce something useful. This is why smart people invented the operating system.

    In the data center, you have pretty much the same resources – compute, storage, networking – except on a larger, more distributed scale. Most data centers feature any number of management systems, many of them optimized for a particular resource or application, and this patchwork has served as the de facto data center operating system with varying degrees of success.

    But now that the data center is about to be redefined from hardware to software, and then distributed not just across a building or a campus but across town and around the world, the need for a cohesive data center operating system is becoming evident.

    So far, only one company, Mesosphere, has claimed the mantle of “data center operating system” for its platform, which seems to be working out for it, considering the interest it has generated among leading enterprise vendors. Both HPE and Microsoft joined the company’s latest financing round, which added a cool $73.5 million to its coffers, and this comes on top of several years of keen interest in the platform, says eWeek’s Jeffrey Burt, with Microsoft floating an outright buyout late last year. As infrastructure moves off of local data center hardware and onto virtualized cloud architectures, whoever owns the operations layer will likely dominate the data market for the next decade or two at least – a lesson that Microsoft learned in the 1980s.

    But while other companies may not describe their platforms as an “operating system,” the functionality is largely the same. Stratoscale, which recently received backing from mobile silicon developer Qualcomm, bills itself as a builder of hardware-agnostic Software-Defined Data Center (SDDC) solutions, effectively offering the enterprise a means to dynamically provision, configure and manage data resources in an automated, real-time fashion. The company’s Symphony platform provides what it calls “rack-scale economics” through a self-optimizing process of continuously matching workloads to available resources. In this way, organizations can focus on core business functions without devoting a lot of time and resources to supporting data infrastructure.
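    The idea of continuously matching workloads to available resources can be illustrated with a simple placement routine. The sketch below is purely illustrative – all names are hypothetical and it is not Stratoscale's actual algorithm – but it shows the basic mechanic a scheduler of this kind automates: find a node with spare capacity, reserve it, and fall back gracefully when nothing fits.

```python
# Illustrative sketch only: a minimal first-fit scheduler showing the general
# idea of matching workloads to available resources. All names here are
# hypothetical; real SDDC platforms use far more sophisticated placement logic.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    cpu_free: int                 # free CPU cores
    mem_free: int                 # free memory, in GB
    workloads: list = field(default_factory=list)

def place(workload_name, cpu, mem, nodes):
    """Assign a workload to the first node with enough spare capacity."""
    for node in nodes:
        if node.cpu_free >= cpu and node.mem_free >= mem:
            node.cpu_free -= cpu      # reserve the resources
            node.mem_free -= mem
            node.workloads.append(workload_name)
            return node.name
    return None  # no capacity anywhere: queue the workload or scale out

nodes = [Node("rack1-a", cpu_free=8, mem_free=32),
         Node("rack1-b", cpu_free=4, mem_free=16)]

print(place("web", 6, 8, nodes))    # fits on rack1-a
print(place("db", 4, 16, nodes))    # rack1-a now lacks CPU, lands on rack1-b
print(place("batch", 8, 8, nodes))  # None: no node has 8 free cores left
```

    A production scheduler would also rebalance as conditions change – the “self-optimizing” part – rather than placing each workload once and forgetting it.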

    Another newcomer to the field is Nimbula. The company recently released the Nimbula Director 1.5 cloud operating system, which links multiple, geographically distributed clouds under a single management pane, allowing the enterprise to dynamically deploy workloads according to cost, configuration or other metrics. The system utilizes a policy-based automation engine capable of multi-tier, multi-tenant functionality, and offers a high degree of customization with zero-touch installation of third-party drivers and software such as RHEL 6. Director 1.5 is available now for existing customers and will see a general release in the third quarter under a free license for up to 40 cores.

    And speaking of RHEL, you may not have noticed, but Red Hat has been quickly expanding the capabilities of its Linux distribution from the server to the data center to the cloud, says ZDNet’s Steven Vaughan-Nichols. Along the way, the company has leveraged the leading open-source platform into a $2 billion revenue stream and has gained a dominant voice in the direction of the open cloud industry. The company is banking on the idea that while proprietary solutions offer easy installation and integration, open source offers the highest degree of customization, which is what emerging businesses will need in order to differentiate themselves in an increasingly data-dependent economy. By focusing on the development, deployment and lifecycle management of applications across the cloud, the company hopes to play a leading role in pure-play software-defined architectures.

    So just like in the PC era, data performance is quickly becoming a function of the operating system. It is still unclear whether a single management stack, even one dubbed a data center OS, can corner the cloud market the way Microsoft did for the desktop, but the fact remains that a single, unified management stack will provide the ease of operation that the enterprise needs in order to stop worrying about infrastructure and start concentrating on business.

    And if a single solution does rise to the top, be prepared to crown a new king, or tyrant, of the data industry.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
