It wasn't too long ago that the trade press was awash in stories about virtualization. The technology remains a vital component of data center infrastructure, but let's face it, the bloom is definitely off the rose.
With attention shifting toward the cloud, virtualization has made the leap from the latest and greatest whiz-bang technology to just another humdrum piece of the data puzzle — about as exciting as a new server or Ethernet router.
According to DynamicOps' Rich Bourdeau, blame the consumerization of IT for virtualization's loss of status. As enterprise users grow accustomed to getting what they want, when they want it, on their iPhones, even virtualization's vastly improved resource allocation and provisioning process seems archaic. Virtualization may provide the dynamic infrastructure that enables the private cloud and all the IT-as-a-service advantages that go with it, but the actual user interface now resides in the cloud, putting the cloud at center stage in the rapidly evolving data environment.
This isn't to say that virtualization as a technology is about to stagnate. On the contrary, development continues in the quest to make virtual environments more efficient and powerful as enterprises seek to accommodate ever-increasing data loads with smaller physical footprints. At the moment, however, most developers seem to be at a crossroads regarding what to do next with their virtual environments, according to InformationWeek. On the one hand, there is a need to standardize virtual platforms across business units, but not at the expense of the ability to devise custom solutions. There is likewise a need to maintain operational control even as managers crave the convenience and cost savings of outsourced infrastructure. And, as always, there is the delicate dance of driving virtualization as deep into the data center as possible without placing mission-critical applications and services at risk.
So it's not virtual technology per se that is drawing developers' interest these days so much as the management of virtual environments. Paragon Software, for example, has hit upon the idea that managing image files, as opposed to virtual machines, opens new doors to data flexibility. The company's Virtualization Manager 12 converts partitions to image files that can be loaded onto virtual machines from any vendor at a moment's notice. A primary application is backup and recovery, allowing backed-up OS partitions to run in any Windows, OS X or Linux environment, even one different from the environment in which the image was created.
In this light, virtualization most definitely has a few tricks left up its sleeve, but it seems the major surge in data performance and flexibility is over, at least in a developmental sense. Many enterprises have yet to flesh out their virtual infrastructure, and so are only just starting to tap the many wonders that the trade press has been crowing about for the past decade.
But unless something dramatic leaps off the drawing boards at VMware or Microsoft, virtualization isn't likely to produce any stunning headlines in the foreseeable future.