Another year is upon us and so, it would seem, is another technology movement. Just as virtualization gave way to the cloud, so too is the cloud giving way to software-defined architectures — characterized by their ability to define and redefine themselves with little or no regard for the hardware that houses them.
But despite the breathless headlines and strong testimonials from vendors and their channel partners about the game-changing impact of this new technology, enterprise executives who are expected to pay for it must nevertheless ask the perennial question “Is this real?” This is particularly relevant now that the “software-defined” moniker is extending beyond its initial assignment in the network realm to include storage, the data center itself and, well, everything.
In part, this movement is driven by the desire to duplicate elsewhere in the data environment the same kind of efficiency and utilization results that virtualization brought to the server farm. As Information Age noted recently, the need to do more with less is still paramount in the enterprise, particularly as the demands of user mobility, Big Data and rich media content continue to mount. But does this necessarily mean that software-defined (fill in the blank) truly represents an advancement in data technology, or is it merely virtualization by another name?
When it comes to the storage farm, the change is real, according to Enterprise Strategy Group’s Mark Peters, even if the semantics are still a little fuzzy. The fact is that terms like “virtualization” and “software-defined x (SDx)” often refer to broad concepts, even though the true value to each data environment is found in the details. So if we’re talking about software-based intelligence that can do away with the manual processes involved in traditional hardware configuration, SDx is a fairly generic term. But once we get into issues surrounding its deployment, operation, feature sets, and the like, then our definition of what is and is not software-defined starts to falter, even as its utility to the enterprise may increase.
A case in point may be some of the recent criticisms being leveled at Nicira’s SDN system. Much of the current SDN momentum, you’ll recall, can be traced back to VMware’s purchase of Nicira last fall, but as Nixu Software’s Juha Holkkola claims, Nicira’s approach falls short of true SDN because it links virtual server instances by IP address rather than through the more flexible Domain Name System (DNS). Granted, this is one software company calling out a rival, but the fact remains that disputes are already arising over whether or not SDN should include components like dynamic DNS provisioning and automation.
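To see why the IP-versus-DNS distinction matters, consider what happens when a virtual server is torn down and re-provisioned at a new address. The sketch below is purely illustrative: the registry dict stands in for a dynamic DNS service, and all names and addresses are hypothetical, not drawn from Nicira’s or Nixu’s actual products.

```python
# A dict standing in for a dynamic DNS service that is updated
# whenever a virtual instance is (re)provisioned.
dns_registry = {"db.internal": "10.0.0.5"}

def link_by_ip(ip):
    # Static linking: the address is captured once, at provisioning
    # time, and never revisited.
    return ip

def link_by_name(name):
    # Name-based linking: the address is resolved at connection time,
    # so it tracks the instance wherever it lands.
    return dns_registry[name]

static_link = link_by_ip(dns_registry["db.internal"])

# Simulate the instance being re-provisioned at a new address.
dns_registry["db.internal"] = "10.0.0.42"

print(static_link)                   # stale address: 10.0.0.5
print(link_by_name("db.internal"))   # current address: 10.0.0.42
```

The point of the sketch is simply that the IP-based link goes stale the moment the instance moves, while the name-based link stays current — which is the substance of the dynamic-DNS argument.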
Is it really necessary, though, to parse these differences to the extreme, asks CohesiveFT’s Patrick Kerpan. Once we’ve agreed that the software-defined data center encompasses all crucial data center elements — compute, storage and networking, as well as things like application deployment and management — everything else is just petty quibbling over the details. And at this point, it may be too early to fully nail down what exactly is and is not SDx because the technology is still evolving. Even VMware has yet to reveal its full SDN portfolio, despite the Nicira purchase and its embrace of the OpenFlow protocol.
All true, and it points to an even greater truth that I’ve tried to give voice to in the past: Labels don’t really matter at all. Terms like virtualization and SDx are usually coined by marketing people as a means to present their products and services to buyers. But what really matters to the enterprise is whether or not the technology at hand actually addresses the challenges it confronts in its data environments.
Accomplishing that feat, however, will require a full understanding of what those challenges are, and then the courage to deploy a solution that works, no matter what the popular lexicon is at the moment.