The farther down the road enterprises travel toward virtualization and the cloud, the more it seems that hardware becomes less relevant.
It's an easy trap to fall into, considering most of the productivity gains of late have come from the virtual layer and above. Commodity servers, storage and networking devices rule the roost at most data centers, and the only pieces of hardware most users think about these days are their cell phones.
But does this mean hardware is out as far as the enterprise is concerned? At this year's Interop, VMware executives were busy touting the "software-defined network" as the next major trend in IT. As the company would have it, virtualization must extend beyond the server farm if the efficiency gains of the past decade are to continue. And in fact there is some truth to this, as the impact of numerous virtual machines on static networking environments is well documented.
HP is on the same track, having recently extended the OpenFlow protocol throughout its networking line. The thinking is that with an open framework in place, enterprises will be able to push virtualization and multi-tenancy across network infrastructures. This will have the twin benefits of making networks more efficient and more agile as they struggle to keep up with increasingly diverse application loads.
Again, nothing wrong with the concept, but I wonder if the notion of all-things-software is being oversold. Does hardware still have a critical role to play in the virtualized, dynamic enterprise of the future? ZDNet's David Chernicoff asks some pointed questions in this direction, at least when it comes to unified virtualization models. Yes, it would be wonderful if managers could control everything through a single console, but how long would such a model last in the face of inevitable hardware upgrades? Most virtual platforms are optimized for a specific hardware environment, which means that under a unified model, a change such as the deployment of PCIe 3.0 would have to be rolled out across the entire infrastructure at once. That would leave you with a choice between an expensive forklift upgrade and gradual hardware obsolescence.
Given the vagaries of most enterprise hardware infrastructure, then, how do you determine which virtual architecture is the best fit? One way is through new generations of resource analytics. Centrix Software, for example, recently updated its WorkSpace iQ and WorkSpace Designer for Citrix suites with new tools that help guide virtual deployments. The package works with Citrix's AppDNA AppTitude systems to delve into user patterns, resource consumption, data and traffic patterns, and other metrics to optimize application and content delivery across virtual environments.
You'd be hard-pressed to find any expert in virtualization technology willing to say that hardware simply doesn't matter anymore. But the continued focus on software-driven solutions for all that ails the enterprise is producing a mindset in which hardware is clearly an afterthought.
If the enterprise truly is evolving into an organic entity capable of growth and change, then care must be taken to keep the entire organism healthy. In higher-order beings, brains, blood and flesh are crucial to survival - but so are bones.