The SDDC in 2016? Probably Not

    Amid all the buzz surrounding the software-defined data center (SDDC), is it reasonable to assume that 2016 will be its breakout year?

    In a way, the answer is yes, but not if your definition of “breakout” means the beginning of widespread commercial deployment. Rather, the advances in the coming year will likely be confined to the lab, with perhaps a few key pieces of the management and automation stack hitting the channel, but not a cohesive software stack ready to take on full production environments.

    The SDDC will certainly come into focus in 2016, says Information Age’s Ben Rossi, so much so that the enterprise will finally be able to view it as a real technology initiative rather than a loose concept. Only the largest, most forward-looking enterprises are in a position to implement anything close to a full SDDC, however, and even then the term is open to such wide interpretation that it will be difficult to determine who has the real thing and who is just posing. But at the very least, we will see showcase deployments that put the technology’s capabilities and its business benefits in plain sight, particularly as they relate to mobile computing, the Internet of Things and advanced cloud architectures.

    Indeed, unless you are on the cutting edge of the data economy, it’s probably wise not to jump into the SDDC just yet, says Gartner analyst Dave Russell. The entire field is still very immature, and few organizations have contemplated the business cases, best-use aspects and risks of going all-software. Remember, this is not merely a change of technology, but a complete transformation of the enterprise, encompassing workflows, business processes, skill sets and corporate culture. A five-year window for SDDC is probably more appropriate for most organizations, with the understanding that failure will likely be a regular facet of the process, so it’s best to work with it rather than fight it.

    Still another factor working against the immediate rise of the SDDC is the perception that it must be implemented all at once, or at least in parallel to legacy infrastructure, which is likely to be costly and cause a lot of confusion. In a recent Infosecurity Magazine survey, 55 percent of respondents said they had no plans to implement SDDC at all, which analyst Peter Bury attributes to the technology’s reputation as a zero-sum game. The reality, though, is that most organizations will convert to all-software gradually, led primarily by business factors that stress low costs, agile service delivery and rapid turnaround.

    In all likelihood, says CloudWedge’s John Hawkins, the SDDC will emerge as a by-product of two existing trends: data federation and service-based infrastructure. With cloud infrastructure capable of supporting virtual resources of all kinds, the desire to push processing closer to the edge will mount. This will lead to the rise of the micro data center, which will likely evolve as an all-software construct that can be placed closer to consumers than centralized hubs, and therefore provide the kind of high-speed, location-based data activity expected of Big Data operations. These kinds of architectures should also improve reliability, lower costs and make more efficient use of resources.

    The SDDC is certainly on the radar as we close out 2015, and the signal will only get stronger as 2016 unfolds. But that’s a far cry from witnessing a full-bore production deployment.

    The IT industry first needs to digest the real implications of emerging data initiatives like Big Data and the Internet of Things before committing to an all-software architecture. And even then, it will probably take a few years before anything close to an end-to-end SDDC takes shape.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
