    Loosely Defining the Software-Defined Data Center

    The software-defined data center (SDDC) is on the minds of many enterprise executives these days, but aside from the broad outlines of a fully virtualized infrastructure, there is little in the way of a clear direction as to how it should be architected or what operational parameters it should support.

    One thing is certain, however: Organizations of all sizes are starting to view legacy data systems as one of the chief impediments to improved digital services, which themselves are seen as vital for success in the next-generation economy.

    According to Wise Guy Reports, the SDDC market is projected to grow at a 22 percent compound annual rate between now and 2021, producing an estimated market of more than $81 billion. Not only is the SDDC expected to provide the scale and flexibility that many data users already get from the cloud; it should also ease management overhead and enable more uniform resource allocation across compute, storage and networking, letting organizations streamline consumption to more closely match their data requirements. Beyond that, however, the rationales for implementing the SDDC vary by workload, business model and a host of other factors, meaning that virtually every organization is likely to implement a highly customized version of the same basic concept.
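    As a rough sanity check on that projection, the short Python sketch below shows what a 22 percent compound annual rate implies. The 2016 baseline (a five-year horizon to 2021) is an assumption, since the report’s exact starting point is not given here.

    ```python
    # Back-of-the-envelope check of the Wise Guy Reports projection. The
    # five-year horizon (2016 through 2021) is an assumption; the report's
    # exact starting year is not stated in the article.

    cagr = 0.22           # 22 percent compound annual growth rate
    final_value = 81e9    # projected market size in 2021, in dollars
    years = 5             # assumed horizon: 2016 through 2021

    implied_start = final_value / (1 + cagr) ** years
    print(f"Implied starting market: ${implied_start / 1e9:.1f}B")  # ~$30.0B
    ```

    In other words, the projection implies a market on the order of $30 billion at the start of the forecast period, more than doubling over five years.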

    But even within specific functions, there are varied interpretations as to what constitutes “software-defined” and how it is best utilized, says Datacenter Dynamics’ Dan Robinson. In storage, for example, EMC’s ViPR platform allows users to manage multi-vendor arrays from a single interface – what some might call “storage virtualization.” On the other hand, an open source file system like GlusterFS pools storage across commodity server nodes, a model more akin to the one driving hyperconverged and high-performance systems. The same goes for networking, where software-defined networking (SDN) and network functions virtualization (NFV) take different paths to abstracting connectivity, even as both can underpin higher-level constructs like virtual private networks (VPNs).
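    To make that architectural distinction concrete, here is a deliberately simplified Python sketch contrasting the two storage models. The classes and methods are hypothetical stand-ins for illustration only, not real ViPR or GlusterFS APIs.

    ```python
    # Illustrative contrast between two "software-defined storage" models.
    # Both classes are hypothetical; neither reflects a real vendor API.

    class ManagementPlaneStorage:
        """ViPR-style model: one control interface over existing vendor
        arrays. Data stays on the arrays; only management is unified."""

        def __init__(self, arrays):
            # arrays maps a vendor name to that vendor's array client,
            # e.g., {"emc": ..., "netapp": ...}
            self.arrays = arrays

        def provision(self, vendor, size_gb):
            # Delegate provisioning to the named backend array.
            return self.arrays[vendor].create_volume(size_gb)


    class PooledStorage:
        """GlusterFS-style model: commodity server nodes pooled into a
        single namespace. The data itself is distributed across nodes."""

        def __init__(self, nodes):
            self.nodes = nodes

        def write(self, key, data):
            # Naive hash-based placement, standing in for the distributed
            # hashing a real pooled file system performs.
            node = self.nodes[hash(key) % len(self.nodes)]
            node.store(key, data)
    ```

    The design difference is the point: the first model abstracts only the control plane, while the second abstracts the data plane itself, which is why the latter maps more naturally onto hyperconverged hardware.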

    But even if an SDDC is not on the enterprise’s immediate to-do list, it can still influence decisions surrounding IT infrastructure and architectural development. Windows IT Pro’s Richard Hay points out that with older Windows Server releases approaching the end of their extended support lifecycles, many organizations might want to save some money on licensing costs by upgrading to Windows Server 2012, which Microsoft will support until October of 2023, rather than the more expensive Windows Server 2016. That will make implementing an SDDC a little more difficult down the road, however, given the myriad Azure-friendly features that Windows Server 2016 sports.

    One of these is the new Network Controller that comes exclusively with Windows Server 2016, says Transforming Network Infrastructure’s Laura Stotler. It allows the enterprise to automate configuration, management and other tasks across both physical and virtual networks, so that virtual machines can communicate among themselves with a high degree of autonomy within the constraints of a user-defined control plane. There are, of course, other ways to accomplish this, such as Fiber Mountain’s AllPath Director on the Glass Core virtual fabric. While this provides a more open approach to the SDDC under the OpenStack framework, thus avoiding single-vendor lock-in, it also has the potential to introduce more complexity into the full management stack.
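    For a sense of what driving the Network Controller programmatically might look like, the hedged Python sketch below queries its northbound REST interface. The controller host name, certificate paths and the /networking/v1 resource path are placeholders based on Microsoft’s documented SDN patterns; verify them against your own deployment.

    ```python
    # Hedged sketch: list virtual networks via the Windows Server 2016
    # Network Controller's northbound REST API. Host name, certificate
    # paths, and the API version segment are placeholders/assumptions.

    import requests

    CONTROLLER = "https://nc.contoso.local"  # hypothetical controller FQDN

    resp = requests.get(
        f"{CONTROLLER}/networking/v1/virtualNetworks",
        cert=("client-cert.pem", "client-key.pem"),  # certificate auth
        verify="nc-ca.pem",                          # controller CA cert
    )
    resp.raise_for_status()

    # Collections come back as {"value": [...]}; each entry describes one
    # virtual network the controller manages.
    for vnet in resp.json().get("value", []):
        print(vnet.get("resourceId"), vnet.get("resourceRef"))
    ```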

    In the end, there is not likely to be a right or wrong answer when it comes to the SDDC. One of the advantages of a fully abstracted data environment is that the enterprise can create services and support mechanisms that no one else in the world has, and this makes it very difficult for rivals to encroach upon an established business model.

    Getting to that point, however, will require a great deal of ingenuity and innovation, not only in the development of the actual services but in creating the software-defined infrastructure that makes them unique.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
