Waiting for the Next Step in Virtualization

Arthur Cole

Virtualization has been the dominant factor in data center development for the past decade. The addition of a virtual layer between physical and logical infrastructure has produced stunning achievements in data flexibility and operations efficiency.

But has the technology run its course? There has been much press about "virtual sprawl," the point at which enterprises feel they can no longer virtualize and consolidate server resources. The reasons given range from lack of storage support for newly virtualized environments to the fear of placing critical applications on virtual infrastructure.

However, this represents limited thinking when it comes to the true benefits to be derived from virtualization, according to IT analyst Barb Goldworm. She predicts that most organizations can increase their virtual footprints by 50 to 70 percent, tapping into not just server infrastructure but networking, storage and the desktop. At the same time, there is no reason why virtual systems could not support Tier 1 applications as well as traditional physical infrastructure.

And for those of you who want to get into some really mind-bending architecture, how about virtualizing your virtual machines? That's what IT consultant Greg Shields calls "nested virtualization," although he's quick to point out that it isn't used so much for general operations as for configuration testing, software evaluation and the like. vSphere 5.0, in fact, offers enhanced support for loading VMs into VMs to, say, check out a VMware Workstation instance inside an ESXi host, and maybe throw in a Hyper-V host for good measure.
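As a rough illustration of what enabling nested virtualization involves on an ESXi 5.0 host, the sketch below shows the commonly cited configuration knobs. Treat the key names, paths and datastore/VM names here as assumptions to be verified against VMware's documentation for your specific build, not as a definitive recipe.

```shell
# Hypothetical sketch, assuming an ESXi 5.0 host with a VM at the path
# shown below (datastore1/outer-vm is a made-up example name).

# 1. Allow virtual hardware-assisted virtualization host-wide:
echo 'vhv.allow = "TRUE"' >> /etc/vmware/config

# 2. In the outer VM's .vmx file, expose hardware virtualization to the
#    guest so it can itself run a hypervisor such as VMware Workstation:
cat >> /vmfs/volumes/datastore1/outer-vm/outer-vm.vmx <<'EOF'
monitor.virtual_exec = "hardware"
hypervisor.cpuid.v0 = "FALSE"
EOF
```

The outer VM must be powered off when its .vmx file is edited, and the physical CPU still needs Intel VT-x/EPT or AMD-V/RVI for the inner hypervisor to perform acceptably.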

Increased virtualization shouldn't lead to total virtualization, however. As Computerworld's Joab Jackson points out, some applications will only see optimal functionality on native hardware. These include apps that rely heavily on a particular resource, such as network or disk I/O, thereby limiting the resource-sharing benefits that virtualization is supposed to provide. As well, virtual desktops will only provide significant advantages over traditional desktop infrastructure if a single, generic image is shared across many users.
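Jackson's point can be made concrete with a little arithmetic: the number of VMs a host can absorb is capped by whichever resource the workload saturates first, not by the resource it has in abundance. The sketch below (with invented capacity and demand figures) shows why an I/O-heavy application leaves CPU headroom on the table.

```python
# Minimal sketch: consolidation ratio is bounded by the scarcest
# resource relative to per-VM demand. Numbers are illustrative only.
def consolidation_limit(host_capacity, vm_demand):
    """Max number of identical VMs a host can hold, checked per resource."""
    return min(host_capacity[r] // vm_demand[r] for r in vm_demand)

# A CPU-light but disk-I/O-heavy app: 16 cores could notionally host
# 16 such VMs, but the disk subsystem allows only two per host.
host = {"cpu_cores": 16, "disk_iops": 10_000}
app = {"cpu_cores": 1, "disk_iops": 4_000}
print(consolidation_limit(host, app))  # 2
```

With a consolidation ratio of two, most of the resource-sharing benefit evaporates, which is exactly the case for keeping such workloads on native hardware.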

Still, sometimes virtualization is limited not by the technology itself, but by the legacy environments it has to deal with. A case in point is the widespread use of agent-based software, according to tech consultant and author Elias Khnaser. Agents are fine in the physical world for functions like backup and anti-virus protection. However, once you go virtual, software agents lead to increased bottlenecks and needless duplication, ultimately reducing your ability to scale resources and foster greater levels of virtualization.

All of this points up the fact that virtualization is more than just a nifty new tool to improve data functionality. Rather, it is a fundamental change in the way data environments are designed, managed and implemented.

The longer the enterprise continues to view virtualization through the short-term lens of hardware consolidation and cost reduction, the longer it will take to position data infrastructures for the truly revolutionary capabilities the technology has to offer.

Nov 22, 2011 11:54 AM Eric Hennessey says:

Some good points raised here. There's frequently a reluctance to virtualize Tier 1 applications, mainly due to concerns over availability. This was the driving market factor for our development of the Application HA product for VMware. We've since extended the product to KVM, IBM LPAR and Solaris LDOM (Oracle VM) environments.

With its elastic nature and rapid provisioning, virtualization is a natural key element for private cloud architectures. But as you point out via Joab Jackson's piece, some workloads such as large back-end database servers simply run better on physical "big iron" servers. As services are moved to private cloud architectures, it's common to see some of a service's components running on virtual servers, but ultimately dependent upon a database server running on physical hardware. It's for this reason we added the concept of Virtual Business Services to our core Cluster Server product and our Application HA product. This allows multi-tier application stacks with components running on a mix of OS and virtualization platforms to be started, stopped and failed over in the proper sequence.

I suspect, though, that most production environments would shy away from nested virtualization, although it could be an economical solution for test/dev and classroom environments.

Eric Hennessey

Sr. Principal Technical Product Manager

Symantec Corp.

