Almost from the very beginning of the modern virtualization movement, technology futurists wondered what it would be like to have a completely virtualized data center. What would be the benefits, and the major challenges, of building an entire compute, storage and networking infrastructure entirely in software?
Those questions are about to be answered now that the IT industry is taking seriously the idea of the software-defined data center (SDDC). In fact, the concept is now openly discussed as the next major segment within the increasingly diversified enterprise infrastructure market.
MarketsandMarkets, for example, recently released its latest SDDC survey, claiming the technology will produce $5.41 billion in revenues by 2018. The trend is drawing vendors from a diverse set of backgrounds, such as big-name platform providers like VMware, HP and Cisco as well as more specialized players like BMC Software and optical networking developer Calient Technologies. And since we are talking about full data center functionality within a virtualized framework, technology contributions range from virtualized IT components to cloud services, advanced management platforms and telecommunications infrastructure.
For an industry that has had its head spun repeatedly over the past decade with every new virtual development, there is some comfort in knowing that SDDC is likely to be the culmination of this revolutionary change. But still, this new state of affairs will take some getting used to. As Government Computer News’ Chris LaPoint puts it, managing IT networks is not like herding cats anymore, “it’s like herding cats that are being chased by dogs that just escaped from the kennel.” His list of key capabilities for SDDC functionality includes increased visibility across technology platforms, workload mobility management, storage integration and automation stability for virtualized applications.
The few showcase SDDC deployments to date, however, have shown some dramatic improvements in both data flexibility and cost reduction. Choice Hotels in Rockville, Maryland, for example, already had a fairly sophisticated virtual/cloud infrastructure in place, which actually resulted in its legacy data center becoming the weak link in an otherwise dynamic data environment. The company set up an SDDC in a colocated facility and ran it in parallel with legacy infrastructure for a few months, but has since fully migrated to the virtual environment and is now able to support critical functions like booking and property management on software-defined architecture.
Already, though, enterprises are more software-defined than they may realize. As Red Hat’s Hervé Marcy notes, anyone who uses operating systems, hypervisors, application servers and applications themselves is already defining operational capabilities through software. The missing link is a sophisticated platform that can integrate these elements into a single, manageable whole. And, this being Red Hat, the best way to do that is to implement an open platform that can circumvent all the proprietary fiefdoms that exist in most data centers today. Once these artificial barriers come down, the enterprise will be able to mix and match crucial components to build the broadly flexible data environments needed to accommodate the dynamic, synchronized workload requirements of the (very near) future.
The next two years or so are shaping up to be the watershed in the new era of data infrastructure. As the early adopters put the finishing touches on their software-defined networks, we should finally see the first fruits of the transition from static, silo-based infrastructure to end-to-end SDDC. It will probably take a decade or more for the entire industry to make the change, but once it’s done, today’s data center may seem as quaint to future IT professionals as rotary telephones are to the mobile generation.