It's getting harder and harder to keep track of everything that's happening with virtualization these days. Like the proverbial octopus, its tentacles are reaching into just about every corner of the enterprise.
This week, we saw IBM offer up a new tape virtualization engine for its TotalStorage system, while Sun kept up the pressure with a virtualization strategy spanning everything from its x86 and SPARC servers to new Solaris containers designed to deploy Xen and VMware more efficiently.
Software advances are coming in leaps and bounds as well, with Parallels close to releasing virtualization packages that put Vista on the Mac. Even the mobile community is getting into the act, talking up the prospect of application virtualization extending the lifecycle of wireless devices.
And who could possibly argue against a technology that offers significant server consolidation in the data center, power reductions in the realm of 40 percent, greater productivity on the desktop, and a substantially longer shelf life for all manner of hardware and software? Our only trepidation at this point is that we've been over-promised before. It seems to be an unfortunate pattern among manufacturers and developers across all industries: develop a product that truly offers the potential for tremendous benefits, then market the heck out of it, even if the actual product won't deliver for another decade.
There's also a long history of technological developments solving one set of problems, only to spawn an entirely new set.
Is this happening with virtualization? It's probably too early to say. There have certainly been enough success stories to give it high marks. But since discretion is the better part of valor, the wisest course of action is to approach virtualization with a clear understanding of what it can and cannot do for your particular environment.