It’s a given that the path to the cloud runs through virtualization. Some would argue that virtualization is not an absolute necessity for a flexible, dynamic data infrastructure, but without it the challenges of getting onto the cloud are so great that the costs outweigh the benefits.
With that in mind, and given that modern server virtualization is well into its second decade in the enterprise, you would think that the basic virtual elements are already in place in most data centers. But you would be wrong.
According to a recent survey from QuinStreet Enterprise, less than half (43 percent, to be exact) of organizations have deployed server virtualization, and 17 percent indicated that they have no plans to adopt the technology at all in the coming year. This is surprising considering that virtualization has done wonders to simplify traditional data center infrastructure, primarily by streamlining hardware footprints and improving resource utilization. Oddly, though, the top reason organizations cited for implementing virtualization is security, which by some accounts can actually be a challenge in rapidly shifting virtual machine environments.
Still, it seems that those who are on board with virtualization are taking to it in a big way. According to AMI-Partners, virtual server adoption is set to grow more than 18 percent in the coming year, with the SMB market alone already pushing past the $1.2 billion mark. And due to the technology’s ease of use and deployment flexibility, VMware is expecting VM numbers to start growing at nearly twice the rate of hardware servers going forward.
This is part of the reason companies like Red Hat continue to pour investment dollars into plain old virtualization, even as they push exotic cloud architectures to more advanced customers, says ServerWatch’s Paul Rubens. The new RHEL 6.5 includes greater compatibility with Hyper-V and Microsoft platforms in general, making it easier to integrate into many enterprises’ legacy environments. In addition, a new CPU hot-plugging/unplugging feature allows CPUs to be enabled or disabled while the guest is running, and an expanded KVM virtual memory feature allows a single guest to scale up to 4TB of memory.
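On a KVM host managed through libvirt, that hot-plug capability is typically exercised with the standard `virsh setvcpus` command. The sketch below is illustrative only: the guest name "rhel65-guest" and the vCPU counts are hypothetical, and the commands are guarded so they run only on a host where libvirt and that guest actually exist.

```shell
# Hypothetical guest name; replace with an actual libvirt domain on your host.
DOMAIN="rhel65-guest"

# Only attempt the change on a real libvirt host where the guest exists.
if command -v virsh >/dev/null 2>&1 && virsh dominfo "$DOMAIN" >/dev/null 2>&1; then
    # Hot-plug: raise the guest to 4 online vCPUs while it is running.
    virsh setvcpus "$DOMAIN" 4 --live
    # Hot-unplug: drop back to 2 vCPUs (the guest OS must support CPU offlining).
    virsh setvcpus "$DOMAIN" 2 --live
    RESULT="applied"
else
    RESULT="skipped (no libvirt host or no such guest)"
fi
echo "vCPU change: $RESULT"
```

The `--live` flag applies the change to the running guest only; adding `--config` as well would persist the new count across reboots.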
Still, too many organizations view virtual infrastructure simply as a means to provide plain old IT in a less expensive and more efficient manner, when in reality it lays the groundwork for an entirely new data ecosystem, says MongoDB’s Matt Asay. For one thing, virtualization provides the bridge between traditional infrastructure and the growing cloud universe, with all the service-based advantages that entails. Despite all the hype the cloud has engendered over the past few years, security and availability issues still give many CIOs the chills, so it is incumbent on virtual platforms to help the enterprise retain a sense of the familiar as it embraces the new cloud paradigm.
All of this will be for naught, however, if IT executives fail to see the need for virtualization in the first place, which is still a challenge for more than half the industry. Of course, every community will have its Luddites, even within groups that take pride in their embrace of advanced technologies.
So before we start dreaming of all the magical things that can happen on virtual and cloud platforms, we should recognize that a large portion of the installed IT base is still not ready for even the most rudimentary virtual functions.