Virtualization Party: Over Imbibing Can Create Costly Consequences

Ann All

Virtualization is a lot like drinking at a college keg party. At first, you hang back, exercise caution and see how you handle it. If you enjoy positive effects (easier to talk to people, makes you a better -- or at least less inhibited -- dancer), you go back for more. Your concern about downsides fades with every drink. You may end up stumbling into the bandstand in front of a packed dance floor or making a similar beer-addled move. If you're unlucky, you'll lose your wallet, your keys or your lunch. If you're really unlucky, you could lose every shred of self-respect you have.

 

The point is, there are consequences. It's a similar story with virtualization.

 

In an interview with Forbes' JargonSpy Dan Woods, Jay Litkey, president and CEO of Embotics, a provider of software for managing virtualized data centers, said virtualization's flexibility has led companies to over-imbibe, creating virtualized assets with little thought given to monitoring or managing them.

 

Every virtual machine (VM) carries costs: space on storage area networks; CPU and memory; power, heat and cooling; the software running inside the VM; and possible licensing and support costs for the operating system, applications and systems-management add-ons. In addition, people must maintain, support and back up VMs. How about some hard numbers? Litkey said his company finds overspending of some kind in more than 95 percent of its clients' environments. In one "fairly representative" example, it recently identified more than $200,000 in overspending and inefficiencies in an environment containing approximately 325 VMs.
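Taken at face value, Litkey's figures suggest a quick back-of-the-envelope check any shop can run on its own environment. The two input numbers below come straight from the example he cites; everything else is simple arithmetic, not Embotics' actual methodology.

```python
# Back-of-the-envelope sketch using the figures Litkey cites:
# ~$200,000 in overspending found across roughly 325 VMs.
overspend_found = 200_000   # dollars, from the interview
vm_count = 325              # VMs in the environment, from the interview

per_vm_overspend = overspend_found / vm_count
print(f"Average overspend per VM: ${per_vm_overspend:,.0f}")
```

That works out to roughly $615 of waste per VM -- a useful sanity-check multiplier against your own VM count, even if your actual mix of storage, licensing and support costs differs.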

 

This is likely to become an even bigger deal because, as IT Business Edge's Art Cole recently pointed out, some companies are moving into more sophisticated uses of virtualization. Whereas before the focus was on consolidating hardware and reaping savings in both capital and operational budgets, now companies are interested in leveraging virtualization for enhanced availability, disaster recovery and other key functions. He wrote:

That means attention is likely to shift from establishing and provisioning virtual resources to setting up the proper management procedures and controls to govern them. As things move along, greater attention will have to be paid to things like uniform build images, configuration discovery and change approval and tracking.

To achieve the ultimate promise of virtualization, he wrote, companies will move to a cloud environment, where "provisioning and other decisions are made not based on what resources are available, but on who has the most appropriate and cost-effective solutions."

 

In the meantime, Litkey told Woods, companies may benefit from services-management frameworks like ITIL and COBIT, which can be used to determine the level of process maturity in current data center environments. After a baseline is established, companies can begin tweaking their existing processes and control systems to consider virtualization and its characteristics. Said Litkey:

Virtualization brings new states and new change actions like copy and snapshot and migrate. These are new dynamics, or new physics if you will, for the data center that did not exist before and need to have elements of control and visibility wrapped around them.
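Litkey's point about wrapping control and visibility around these new change actions can be sketched as a simple policy check plus audit trail. Everything below is illustrative: the state names, allowed-action table and log format are assumptions for demonstration, not Embotics' product or any real hypervisor API.

```python
# Illustrative sketch: gate VM change actions (copy, snapshot, migrate, ...)
# behind an allow-list per state, and record every attempt for visibility.
# State names and the policy table are hypothetical examples.
ALLOWED = {
    "running": {"snapshot", "migrate", "stop"},
    "stopped": {"copy", "snapshot", "start"},
    "template": {"copy"},
}

def apply_action(state, action, audit_log):
    """Deny unapproved change actions; log everything, approved or not."""
    if action not in ALLOWED.get(state, set()):
        audit_log.append((state, action, "denied"))
        raise ValueError(f"action '{action}' not allowed in state '{state}'")
    audit_log.append((state, action, "approved"))
    # Only power-state actions transition the VM; copy/snapshot/migrate don't.
    transitions = {"stop": "stopped", "start": "running"}
    return transitions.get(action, state)

log = []
state = "running"
state = apply_action(state, "snapshot", log)  # stays "running"
state = apply_action(state, "stop", log)      # now "stopped"
state = apply_action(state, "copy", log)      # stays "stopped"
```

The design choice is the one Litkey implies: because copy, snapshot and migrate did not exist in the purely physical data center, existing change-approval processes won't catch them unless they are explicitly enumerated and audited.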

