    The SDDC Ready to Step Up

    The software-defined data center (SDDC) represents the most radical shift in enterprise infrastructure to date. As a fully service-based construct, the SDDC lends itself not only to streamlined infrastructure and data architectures but also to an entirely new management paradigm, one that pushes tasks like configuration and provisioning onto development teams, or even onto the applications themselves.
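
    To make that idea concrete, here is a minimal sketch, in Python, of what application-driven provisioning might look like. Everything in it is hypothetical for illustration; the ResourceRequest and Provisioner names are invented and do not correspond to any vendor's actual API.

        # Toy model of an application declaring its own infrastructure
        # needs, in the spirit of SDDC self-provisioning. Hypothetical API.
        from dataclasses import dataclass

        @dataclass
        class ResourceRequest:
            vcpus: int
            memory_gb: int
            storage_gb: int
            network_policy: str   # e.g., "isolated" or "shared"

        class Provisioner:
            """Toy stand-in for the SDDC control plane's provisioning service."""
            def provision(self, req: ResourceRequest) -> str:
                # A real control plane would schedule this against pooled,
                # virtualized hardware; here we simply acknowledge the request.
                return (f"provisioned {req.vcpus} vCPUs, {req.memory_gb} GB RAM, "
                        f"{req.storage_gb} GB storage, policy={req.network_policy}")

        # The application declares its own needs at startup:
        request = ResourceRequest(vcpus=4, memory_gb=16, storage_gb=500,
                                  network_policy="isolated")
        print(Provisioner().provision(request))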

    But there are pitfalls here, as well. Software, of course, can be buggy and is prone to security breaches and other dangers that could bring operations to a halt. So even though the pressure is on to shed traditional infrastructure, the enterprise should tread carefully here, at least until the technology has gained some maturity.

    That probably won’t take long, however. According to Allied Market Research, the global SDDC market is expected to approach $140 billion by 2022, representing an average annual growth rate of 32 percent. This growth is driven largely by the desire to craft hybrid cloud models that can adapt quickly to emerging data patterns and lower the overall cost of highly scaled data environments. Naturally, it is much easier to do this on a services model than on a traditional fixed hardware architecture, although integration, security and other issues pose a perennial challenge, even in highly automated settings.

    The SDDC also lends itself well to another major trend sweeping the enterprise: hyperconvergence. As Bitdefender’s Liviu Arsene noted on InformationWeek recently, the hardware commoditization that underpins hyperconverged infrastructure makes abstraction of high-level data architectures all but inevitable. Now that networking has joined compute and storage as a virtualized resource, there is nothing to prevent entire data stacks from being defined in software, which allows resources to be provisioned as services rather than as fixed assets. Top cloud providers are already steeped in this model, which means the enterprise will need to do the same if it hopes to match the service levels that the cloud can deliver to the knowledge workforce.
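
    A hedged sketch of what "defining the stack in software" can mean in practice: the snippet below expresses a complete compute/storage/network stack as plain data that a control plane could validate and act on. The schema is invented for the example, not any real product's format.

        # Illustrative sketch only: an entire stack (compute, storage,
        # networking) captured as data a control plane could act on.
        # All field names are invented.
        stack_definition = {
            "compute": {"instances": 3, "vcpus_each": 8, "memory_gb_each": 32},
            "storage": {"tier": "ssd", "capacity_tb": 2, "replicas": 3},
            "network": {
                "segments": ["app", "db"],
                "firewall_rules": [
                    {"src": "app", "dst": "db", "port": 5432, "allow": True},
                ],
            },
        }

        def validate(stack: dict) -> None:
            """Basic sanity checks a control plane might run before provisioning."""
            assert stack["compute"]["instances"] > 0, "need at least one instance"
            assert stack["storage"]["replicas"] >= 2, "replicate for durability"
            known = set(stack["network"]["segments"])
            for rule in stack["network"]["firewall_rules"]:
                assert rule["src"] in known and rule["dst"] in known, "unknown segment"

        validate(stack_definition)
        print("stack definition accepted:", ", ".join(stack_definition))

    The point is less the specific schema than the fact that a change to the stack becomes an edit to data, which can be versioned, reviewed and rolled back like any other code.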

    Indeed, says VMware in a guest blog on MSPmentor, the SDDC only became a practical reality in the past few years as it achieved several key capabilities. First, it had to seamlessly support traditional and cloud-native applications using standard x86 storage and IP networking. It also had to support the virtual infrastructure that most organizations had developed over the previous decade or more, and it had to enable self-service access and robust security for high-scale, dynamically burstable workloads. With these capabilities in place, the SDDC is now ready to take on the rigors of modern production workloads.

    Still, in any SDDC development program, it is important to stay focused on the services in need of support and the user outcomes to be achieved, rather than on the technology being deployed, says NetApp’s Michael Elliott. The SDDC can be architected in a virtually unlimited number of ways, and its design will undoubtedly evolve over time as data requirements change and intelligent management systems take on more of the day-to-day operational load. Using advanced fabric layers and highly federated resource sets, the SDDC offers the means to extend holistic support to emerging applications: administrators and/or automated systems will be able to craft optimal run environments for individual use cases instead of trying to force-fit every function into the confines of static infrastructure.
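
    As a rough illustration of that per-use-case matching, the sketch below maps a workload's requirements onto a small catalog of run environments. The profiles, names and thresholds are invented for the example; a real SDDC scheduler would weigh far more signals.

        # Illustrative only: choosing a run environment per use case rather
        # than force-fitting every workload onto one static profile. The
        # environment catalog and selection rules are invented.
        ENVIRONMENTS = {
            "latency_sensitive": {"placement": "edge", "storage": "nvme", "qos": "high"},
            "batch_analytics": {"placement": "core", "storage": "object", "qos": "best_effort"},
            "general_purpose": {"placement": "core", "storage": "ssd", "qos": "standard"},
        }

        def pick_environment(max_latency_ms: float, working_set_tb: float) -> str:
            """Toy stand-in for the policy logic an SDDC scheduler might apply."""
            if max_latency_ms < 10:
                return "latency_sensitive"
            if working_set_tb > 10:
                return "batch_analytics"
            return "general_purpose"

        name = pick_environment(max_latency_ms=5, working_set_tb=0.5)
        print(name, "->", ENVIRONMENTS[name])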

    If the advantages of software-defined infrastructure over hardware-defined infrastructure could be summed up in one word, that word would be flexibility. There will undoubtedly be mistakes and false starts in the drive to craft a working abstract infrastructure, but in virtually every case, corrective measures are easier to implement in code than through a hardware retrofit.

    And ultimately, we could see the rise of autonomous, self-correcting data environments that require very little human involvement at all, except to tell them what you are trying to accomplish.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.
