It seems that all the virtual pieces are finally in place, and the enterprise is poised for an unprecedented leap in data performance and flexibility.
The arrival of software-defined networking has heralded the drive to the fully software-defined data center (SDDC), in which all physical aspects of the data environment—servers, storage, networking and the host of specialty appliances on the market—can be created, provisioned and decommissioned entirely via software. It is essentially the difference between data users’ behavior conforming to the dictates of infrastructure and the infrastructure conforming to the needs of users.
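The lifecycle at the heart of that definition, creating, provisioning and decommissioning infrastructure entirely through software, can be sketched in a few lines. The class and method names below (`Datacenter`, `VirtualResource` and so on) are illustrative assumptions for this article, not any vendor's actual API.

```python
from dataclasses import dataclass

# Hypothetical resource record: in an SDDC, a server, storage volume, or
# network segment is just an entry the control software creates and destroys.
@dataclass
class VirtualResource:
    kind: str          # e.g. "server", "storage", "network"
    spec: dict         # desired capacity, addresses, policies, etc.
    provisioned: bool = False

class Datacenter:
    """Illustrative software-defined control plane: the full lifecycle
    (create -> provision -> decommission) happens via API calls alone,
    with no physical intervention."""
    def __init__(self):
        self._resources = {}
        self._next_id = 0

    def create(self, kind, **spec):
        rid = self._next_id
        self._next_id += 1
        self._resources[rid] = VirtualResource(kind, spec)
        return rid

    def provision(self, rid):
        self._resources[rid].provisioned = True

    def decommission(self, rid):
        del self._resources[rid]

dc = Datacenter()
web = dc.create("server", cpus=4, ram_gb=16)
vol = dc.create("storage", size_gb=500)
dc.provision(web)
dc.provision(vol)
dc.decommission(vol)  # capacity returned to the pool by software alone
```

The point of the sketch is the inversion the paragraph describes: the infrastructure is reshaped on demand to fit the workload, rather than the workload being fitted to fixed hardware.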
But just because we can now envision such a scenario does not mean getting there will be easy, or cheap. A host of issues must be confronted—everything from systems and data migration to policy development and resource allocation—in order to bring the SDDC from the lab to the real world.
The latest market data seems to indicate that, potential problems aside, the IT industry is ready to give SDDC a try. Reportstack estimates the sector will grow at a CAGR of 97.48 percent between now and 2018, driven primarily by the need to harness virtual and cloud environments to meet the growth of structured and unstructured data. At the same time, Research and Markets predicts that SDDC will accomplish no less than the complete transformation of IT and the data center as we know it, with virtually every organization on the planet embracing software-based infrastructure to optimize service and application delivery across disparate data environments.
With most enterprises already heavily invested in virtualization, it would be almost a crime not to extend those benefits to storage and network infrastructure, says Mike Koehler, senior vice president of EMC’s Global Services unit. The key, though, is a unified management and orchestration stack that allows the entire infrastructure to be manipulated as a single entity. The enterprise should aim for nothing less than the delivery of IT as discrete sets of on-demand services that can be mixed and matched to suit the unique needs of end users and the applications they require to complete their tasks.
This may be the future of IT, but the fact remains that most legacy infrastructure consists of wildly disparate hardware and software platforms, and getting this mishmash to behave as an integrated environment will be no easy task. A company called SOA Software, however, says it may have a way to bridge those disparate architectures in the form of a new API platform that works around vendor-specific technologies, even if they don't support open platforms like OpenFlow and OpenStack. The platform provides a uniform API management layer that enables secure, policy-driven connections between applications and infrastructure, along with message protocol transformation and vendor-neutral connectivity that fosters many-to-many resource integration. In this way, applications can be programmed with everything they need to compile the resource environment that best suits the task at hand.
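A uniform API layer of this kind is, structurally, an adapter pattern: each vendor-specific technology sits behind one neutral interface, and the layer translates a single generic request into whatever message format each device expects. The adapters and wire formats below are illustrative assumptions for this article, not a description of SOA Software's actual product.

```python
from abc import ABC, abstractmethod
import json

class VendorAdapter(ABC):
    """One adapter per vendor-specific technology; the management
    layer only ever sees this neutral interface."""
    @abstractmethod
    def translate(self, request: dict) -> str:
        ...

class JsonVendorAdapter(VendorAdapter):
    # Hypothetical modern device that accepts a JSON wire format.
    def translate(self, request):
        return json.dumps({"op": request["action"], "target": request["resource"]})

class LegacyCliAdapter(VendorAdapter):
    # Hypothetical legacy device that only accepts CLI-style commands.
    def translate(self, request):
        return f'{request["action"]} --resource {request["resource"]}'

class ApiGateway:
    """Uniform API management layer: applications issue one neutral
    request; the gateway performs per-vendor protocol transformation."""
    def __init__(self):
        self._adapters = {}

    def register(self, vendor, adapter):
        self._adapters[vendor] = adapter

    def dispatch(self, vendor, request):
        return self._adapters[vendor].translate(request)

gw = ApiGateway()
gw.register("vendor_a", JsonVendorAdapter())
gw.register("vendor_b", LegacyCliAdapter())

req = {"action": "provision", "resource": "vlan-42"}
gw.dispatch("vendor_a", req)  # JSON message for vendor A
gw.dispatch("vendor_b", req)  # CLI string for vendor B
```

The design choice is the one the article implies: applications target the gateway's single interface, so adding a new vendor means writing one adapter rather than rewriting every application, which is what makes many-to-many resource integration tractable.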
At the very least, the SDDC movement provides a clear vision of what the virtual/cloud end game will look like. Data users will no longer worry about not having enough throughput or storage capacity to handle that rush batch job, while enterprises will be able to more closely match resource allocation and operational costs with actual data loads, rather than over-provisioning to guard against worst-case scenarios.
And in general, we should see more efficient, effective IT, even if the transition between today’s legacy infrastructure and tomorrow’s SDDC probably won’t go as smoothly as some of the technology’s most ardent supporters predict.