Software-Defined Data Center Challenges

    It seems like barely a week goes by that there isn’t another development in the software-defined data center.

    But as the advancements pile up, one thing is becoming clear, or rather less clear the more you think about it: the more vendors, developers, systems integrators and data providers enter the field, the more muddled it becomes. What once appeared to be a fairly straightforward, albeit highly technical, means of extending the benefits of hardware virtualization across both localized and distributed infrastructure is quickly becoming a mishmash of platforms, architectures and design philosophies that could very well end up destroying the broad universality the technology was supposed to engender.

    In this way, software-defined tech is no different from the many IT evolutions of the past. Yet it is still painful to see another golden opportunity for widespread infrastructure interoperability slip through the data community’s grasp.

    Though I am about to discuss IBM and its latest software-defined technology plans, by no means do I wish to imply that Big Blue is looking to corner the technology for proprietary purposes. In fact, more than most, IBM seems to have pursued a cooperative strategy across all manner of virtual platforms, preferring to foster multivendor ecosystems rather than end-to-end solutions. And as Chris Preimesberger, my colleague at eWeek, points out, the company’s new Software-Defined Environments group, made up largely of the earlier Application, Integration and Middleware unit, is still highly engaged in the open source OpenStack project for the bulk of its software-defined product development.

    Of course, the devil, as they say, is in the details. And according to Jamie Thomas, the group’s new GM of software-defined systems, the endgame is nothing less than a complete reset of IT infrastructure. As she explained, complete data environments will soon be encapsulated and automated through scripts, much like software solutions are today. This shouldn’t come as a big surprise: once infrastructure becomes just another facet of software, it can be controlled and manipulated like software. The question remains, though: Can IBM, and the OpenStack/OpenFlow community in general, really implement this kind of change through multivendor agreements and promises to play nice with one another?
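    To make the "infrastructure as software" idea concrete, here is a minimal sketch of how a scripted, declarative environment might work. The API below (ServerSpec, reconcile) is entirely hypothetical, not IBM's or OpenStack's actual interfaces; it simply shows the pattern of declaring a desired environment as data and computing the changes needed to get there.

```python
# Hypothetical sketch of declarative infrastructure automation: the desired
# environment is expressed as plain data, and a reconciler works out which
# actions would move the current environment toward that state.
from dataclasses import dataclass

@dataclass(frozen=True)
class ServerSpec:
    name: str
    cpus: int
    memory_gb: int

def reconcile(current: dict, desired: list) -> list:
    """Return the actions needed to move `current` to the `desired` state.

    `current` maps server name -> (cpus, memory_gb) as currently deployed.
    """
    actions = []
    wanted = {s.name: s for s in desired}
    # Anything deployed but no longer declared gets removed.
    for name in current:
        if name not in wanted:
            actions.append(f"delete {name}")
    # Anything declared but missing gets created; mismatches get resized.
    for name, spec in wanted.items():
        if name not in current:
            actions.append(f"create {name} ({spec.cpus} CPU, {spec.memory_gb} GB)")
        elif current[name] != (spec.cpus, spec.memory_gb):
            actions.append(f"resize {name}")
    return actions

# Because the environment is now just software, the plan is versionable,
# reviewable and testable like any other code artifact.
plan = reconcile(
    current={"web1": (2, 4)},
    desired=[ServerSpec("web1", 4, 8), ServerSpec("db1", 8, 32)],
)
print(plan)  # ['resize web1', 'create db1 (8 CPU, 32 GB)']
```

    The interesting design question, and the one the multivendor debate turns on, is who defines the schema that `ServerSpec` stands in for here: an open project such as OpenStack, or each vendor's own proprietary stack.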

    The flip side of that question, however, is whether the more proprietary approaches of VMware, Cisco, Oracle and others will be any better. VMware, for example, offers a full suite of systems (vSphere, vCloud, vFabric, vCenter), all designed to provide a coordinated, integrated approach to software-based everything. As another colleague, Paul Shread, points out, since it already owns much of the virtual server action within most data centers, VMware is in a unique position to simplify infrastructure, not merely mask its complexities the way the operating system does for the PC. Once the enterprise gets knee-deep in a VMware-style software-defined environment, though, will that help or hurt its ability to integrate with the wider data environment that will surely encompass both proprietary and non-proprietary systems from multiple vendors?

    It should come as no surprise, however, that both the open and proprietary vendors are confident of their success, if only because software-defined technology is the best way for IT to meet the needs of 21st century data users. A recent survey of federal IT managers by MeriTalk showed widespread support for software-defined infrastructure, with more than 80 percent saying it was vital to their agency’s ability to function. Unfortunately, that message is not getting through to the money managers, who still earmark close to 80 percent of IT budgets for updating and maintaining legacy systems.

    The point to all this is that software-defined everything is not likely to be the single solution that solves all your data management needs, but rather a multitude of solutions that will require a fair amount of systems testing, integration and deployment configuration, much like today’s data infrastructure. The software aspect should simplify the process a bit and make for a more robust, dynamic data ecosystem, but in the end, we’ll still be dealing with the same IT industry as before: the one in which individual vendors place business models and profit margins ahead of user needs every time.

    Arthur Cole
    With more than 20 years of experience in technology journalism, Arthur has written on the rise of everything from the first digital video editing platforms to virtualization, advanced cloud architectures and the Internet of Things. He is a regular contributor to IT Business Edge and Enterprise Networking Planet and provides blog posts and other web content to numerous company web sites in the high-tech and data communications industries.