    The SDDC: Still a Work in Progress

    It’s funny how technology always progresses to a higher state even before the current state has made its way to widespread use. First blade servers, then virtualization and then the cloud all made their way into the collective IT consciousness while most enterprise managers were still getting their feet wet with the current “state of the art” technology.

    These days, the buzz is all about the software-defined data center (SDDC), which is an amalgam of nearly everything that has happened to IT over the past decade cobbled together into one glorious, trouble-free computing environment. And if you believe that last part, I have a bridge to sell you.

    What is clear is that by virtualizing the three pillars of data infrastructure – compute, storage and networking – entire data environments could potentially be created and dismantled at will. I say “potentially” because the technology to do this simply does not exist yet, at least not in the way that people expect: quickly, easily and with little or no training.

    According to Peter Nikoletatos, CIO at Australian National University, the SDDC will unleash a wave of innovation that will finally bring manageability to today’s unmanageable infrastructure. At the same time, it will allow IT to become more agile in response to rapidly changing data and market conditions and to more easily align resource consumption, and therefore costs, with data loads. The key, of course, is advanced automation stacks that can alter abstract software architectures far more easily than they can the physical infrastructure that currently drives the data process.
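
    To make that idea concrete, here is a minimal sketch, in Python, of the “desired state versus actual state” loop that sits at the heart of most automation stacks. Everything in it (the EnvironmentSpec fields, the inventory numbers, the action strings) is invented for illustration rather than drawn from any particular product.

        # A hypothetical "desired state vs. actual state" reconciler, the pattern
        # behind most SDDC automation stacks: the operator edits an abstract spec
        # and the stack works out which changes to make. Names and fields are
        # illustrative only.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class EnvironmentSpec:
            """Abstract description of an environment: what is needed, not where it runs."""
            vcpus: int
            memory_gb: int
            storage_tb: int
            network: str  # e.g. a VLAN or SDN segment name

        def reconcile(desired: EnvironmentSpec, actual: EnvironmentSpec) -> list[str]:
            """Return the actions needed to move the actual state toward the spec."""
            actions = []
            if actual.vcpus < desired.vcpus:
                actions.append(f"scale compute to {desired.vcpus} vCPUs")
            if actual.memory_gb < desired.memory_gb:
                actions.append(f"grow memory to {desired.memory_gb} GB")
            if actual.storage_tb < desired.storage_tb:
                actions.append(f"provision {desired.storage_tb - actual.storage_tb} TB of storage")
            if actual.network != desired.network:
                actions.append(f"reattach workloads to {desired.network}")
            return actions

        if __name__ == "__main__":
            desired = EnvironmentSpec(vcpus=64, memory_gb=256, storage_tb=20, network="vlan-prod")
            actual = EnvironmentSpec(vcpus=32, memory_gb=128, storage_tb=12, network="vlan-dev")
            for step in reconcile(desired, actual):
                print(step)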

    This sounds terrific at the outset, but as Datalink’s Kent Christensen warns, this is merely a vision of what the SDDC will look like – the actual product could be very different. The fact is, most early adopters of the SDDC are hyperscale providers like Google and Amazon, who are using their own home-grown constructs, and their requirements differ from those of the run-of-the-mill enterprise. At best, most organizations are laying the groundwork for the SDDC through advanced virtualization and policy-based workload management, but the higher-layer automation and orchestration needed to complete the SDDC is still a work in progress.
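
    Policy-based workload management is the piece most shops can already experiment with, and its general shape fits in a few lines. The sketch below assumes hypothetical hosts, a hypothetical policy object and a made-up ranking rule; real orchestration layers weigh far more factors.

        # A toy sketch of policy-based placement: hard constraints filter the hosts,
        # soft preferences rank what is left. Hosts, policy fields and the ranking
        # rule are all hypothetical.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Host:
            name: str
            free_vcpus: int
            free_memory_gb: int
            storage_tier: str   # e.g. "ssd" or "hdd"
            zone: str           # failure domain

        @dataclass
        class Policy:
            min_vcpus: int
            min_memory_gb: int
            required_tier: str
            preferred_zone: str

        def place(policy: Policy, hosts: list[Host]) -> Optional[Host]:
            """Pick a host that meets the hard constraints, preferring the requested zone."""
            candidates = [h for h in hosts
                          if h.free_vcpus >= policy.min_vcpus
                          and h.free_memory_gb >= policy.min_memory_gb
                          and h.storage_tier == policy.required_tier]
            if not candidates:
                return None
            # Same zone first, then the host with the most compute headroom.
            candidates.sort(key=lambda h: (h.zone != policy.preferred_zone, -h.free_vcpus))
            return candidates[0]

        hosts = [Host("rack1-a", 8, 64, "ssd", "zone-a"),
                 Host("rack2-b", 16, 128, "ssd", "zone-b"),
                 Host("rack3-a", 32, 256, "hdd", "zone-a")]
        policy = Policy(min_vcpus=8, min_memory_gb=64, required_tier="ssd", preferred_zone="zone-a")
        print(place(policy, hosts).name)  # rack1-a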

    Then there is the very real problem, says tech journalist Simon Bisson, of layering this kind of automation on top of the wildly divergent collection of hardware that populates most data centers. To start with, you’ll probably have to standardize your disk drives, solid-state storage and storage appliances, then move on to a common blade server and rack configuration, and finally adopt a coherent network architecture that bridges fiber, Ethernet and software-defined networking (SDN) constructs. Again, the hyperscale crowd doesn’t need to worry too much about this because they’ve already standardized their infrastructure on ODM hardware, but the rest of the industry still has quite a bit of work to do.

    It is also impossible to define the SDDC without taking the cloud into account, according to Cirba co-founder Andrew Hillier. The three key processes required for an effective cloud are self-service access, resource monitoring/tracking and automation. At best, today’s cloud management systems can achieve a moderately granular level of resource provisioning and perhaps some rudimentary automation. So a typical self-service portal can find resources, but they are not necessarily optimal for the application at hand. A fully software-based cloud control plane, however, would overcome this disconnect, and it is here that software-defined platforms can have the greatest impact. In short, we should not be talking about the software-defined data center at all, but the software-defined cloud.
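
    The gap Hillier describes can be shown in a few lines: a basic self-service portal grabs the first host that fits, while a control plane that actually tracks utilization can pick the best fit. The capacity figures and the best-fit rule below are purely illustrative.

        # First-fit is roughly what a basic self-service portal does; best-fit is the
        # kind of choice a tracking-aware, software-defined control plane can make.
        # Host names and capacities are made up for illustration.
        from typing import Optional

        hosts = {"host-a": {"free_vcpus": 30},
                 "host-b": {"free_vcpus": 6},
                 "host-c": {"free_vcpus": 12}}

        def first_fit(request_vcpus: int) -> Optional[str]:
            """Take the first host with enough room, optimal or not."""
            for name, h in hosts.items():
                if h["free_vcpus"] >= request_vcpus:
                    return name
            return None

        def best_fit(request_vcpus: int) -> Optional[str]:
            """Minimize leftover capacity so large hosts stay free for large requests."""
            candidates = [(h["free_vcpus"] - request_vcpus, name)
                          for name, h in hosts.items() if h["free_vcpus"] >= request_vcpus]
            return min(candidates)[1] if candidates else None

        print(first_fit(4))  # host-a: it fits, but it fragments the roomiest host
        print(best_fit(4))   # host-b: the tightest fit, leaving headroom elsewhere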

    It seems clear from all this that the software-defined data environment is not something that will be switched on like a lamp. Rather, it will take quite a bit of trial and error, probably with more emphasis on the error, and in the end a wide variety of solutions will give purists plenty of fodder in their arguments as to what qualifies as software defined.

    In the meantime, most enterprises would be wise to launch serious efforts to bring virtualization to their server, storage and network infrastructure in order to prepare themselves for full software-based functionality. Many of the key details have yet to be worked out, but if you wait until the vision is entirely clear, you’ll find yourself struggling to keep up with a technology that has already passed you by.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
