VMware predicted software-defined data centers (SDDC) would “hit it big” in 2013. Spoiler: That didn’t happen.
Nonetheless, the concept hasn’t gone away. In fact, IT Business Edge’s Infrastructure blogger, Arthur Cole, wrote about SDDCs several times this year, including a November article in which he called the idea “a work in progress.” He did a great job of summing up SDDCs and the current opinion of them.
Still, it raises the question: Could 2015 be the year that SDDCs actually, finally, take off? Michael Hay thinks so.
Hay is the vice president of Product Planning at Hitachi Data Systems and chief engineer for the Information Technology Platform Division (ITPD). In a recent InformationWeek column, Hay predicted that SDDCs will be one of three disruptive trends in the coming year.
“There is a lot of interest in all things software-defined, especially the software-defined data center (SDDC),” he wrote. “I believe in 2015 the market will move away from trying to define what constitutes an SDDC and toward tangible business goals made possible by SDDCs.”
That made me wonder: What do SDDCs mean for the data, in terms of usability, access and analytics?
First, an SDDC would virtualize assets, which would make it easier for application developers to build for business needs, Hay explained. He described an SDDC that uses RESTful APIs and application service catalogs and couples them with converged and hyper-converged systems. That’s a lot of buzzwords, so I’ll sum up: It’d be service-enabled, so developers could access the assets via services without having to bother the network folks or DBAs. Converged and hyper-converged systems are essentially tightly integrated, modular systems, which would further simplify development work. (This TechRepublic article offers a detailed explanation of converged versus hyper-converged systems, if you’re really curious.)
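To make “service-enabled” a little more concrete, here is a minimal sketch of what requesting a resource through a RESTful service catalog might look like, instead of filing a ticket with the network team or a DBA. The endpoint, catalog item name and payload fields are hypothetical assumptions for illustration, not any particular vendor’s API.

```python
# A minimal sketch of service-enabled provisioning against a hypothetical
# SDDC service catalog. The URL, catalog item, and fields are illustrative only.
import requests

CATALOG_URL = "https://sddc.example.com/api/v1/catalog"  # hypothetical endpoint

def provision_database(app_name: str, size_gb: int, tier: str = "gold") -> dict:
    """Request a database instance from the service catalog via a REST call,
    rather than opening a ticket with the network or DBA teams."""
    payload = {
        "service": "managed-database",      # catalog item (hypothetical)
        "requestedFor": app_name,
        "parameters": {"sizeGb": size_gb, "serviceTier": tier},
    }
    response = requests.post(f"{CATALOG_URL}/requests", json=payload, timeout=30)
    response.raise_for_status()
    return response.json()  # e.g., a request ID and connection details

if __name__ == "__main__":
    print(provision_database("order-analytics", size_gb=250))
```

The point of the pattern is the one Hay makes: the developer works against a service contract, and the details of the underlying storage, network and database platforms stay hidden behind it.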
“The real win here is a return on productivity, which is made possible by virtualized assets that shield application developers in business teams from the details of the underlying platforms,” Hay wrote.
This actually makes a lot of sense when you consider the trend to embed analytics into business applications. The goal is to put analytics where the user already works, at the point of decision-making.
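In practice, embedding analytics at the point of decision often amounts to calling an analytics service from inside the business workflow itself. The sketch below is a hypothetical illustration of that pattern; the scoring URL, field names and threshold are assumptions, not a reference to any product named in this article.

```python
# Hypothetical example: surface an analytic (a risk score) inside the approval
# workflow itself, so the user sees it at the moment the decision is made.
import requests

ANALYTICS_URL = "https://analytics.example.com/api/v1/score"  # hypothetical service

def review_order(order: dict) -> str:
    """Fetch a risk score for an order and fold it into the approval decision."""
    response = requests.post(ANALYTICS_URL, json={"order": order}, timeout=10)
    response.raise_for_status()
    risk = response.json().get("riskScore", 0.0)
    # The analytic result drives the decision the user is already making here.
    return "hold-for-review" if risk > 0.8 else "approve"

if __name__ == "__main__":
    print(review_order({"id": "SO-1001", "amount": 4200, "customer": "ACME"}))
```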
Symantec built the world’s largest SDDC to run its enterprise, according to Senior Director of Product Marketing Drew Meyer. In a Forbes article, he revealed that information management is one of two large challenges with SDDCs:
“This task of managing the ‘data about the data’ is a new layer of complexity in exchange for the tremendous freedom of an SDDC. Delivering the right resources to the right users in the right way requires a grasp on information management.”
The other challenge is achieving agility. SDDCs must deliver the right resources, in the right way, to the right users. Those three elements are the keys to SDDC success, he added, but it’s not easy.
“While the SDDC may scale faster, corresponding and organizing demands of the virtual elements can sprawl out of control,” he wrote.
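One way to read Meyer’s point about the “data about the data”: every virtual resource an SDDC spins up needs its own metadata — who owns it, why it exists, when it should be reclaimed — or the sprawl he describes takes over. Here is a minimal, hypothetical sketch of that kind of bookkeeping; the fields and the registry are illustrative assumptions, not Symantec’s implementation.

```python
# Hypothetical sketch of tracking "data about the data" for virtual resources,
# so what an SDDC provisions stays visible instead of sprawling out of control.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ResourceMetadata:
    resource_id: str
    owner: str          # team or user responsible for the resource
    purpose: str        # why it exists (ties it back to a business goal)
    expires: date       # when it should be reclaimed
    tags: dict = field(default_factory=dict)

class MetadataRegistry:
    """A toy registry: record every provisioned resource and flag expired ones."""
    def __init__(self):
        self._records = {}

    def register(self, meta: ResourceMetadata) -> None:
        self._records[meta.resource_id] = meta

    def expired(self, today: date) -> list:
        return [m for m in self._records.values() if m.expires < today]

registry = MetadataRegistry()
registry.register(ResourceMetadata("vm-042", "order-analytics",
                                   "Q1 demand model", date(2015, 6, 30)))
print([m.resource_id for m in registry.expired(date(2015, 12, 31))])
```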
Loraine Lawson is a veteran technology reporter and blogger. She currently writes the Integration blog for IT Business Edge, which covers all aspects of integration technology, including data governance and best practices. She has also covered IT/Business Alignment and IT Security for IT Business Edge. Before becoming a freelance writer, Lawson worked at TechRepublic as a site editor and writer, covering mobile, IT management, IT security and other technology trends. Previously, she was a webmaster at the Kentucky Transportation Cabinet and a newspaper journalist. Follow Lawson on Google+ and Twitter.