How to Approach Implementing Virtual Technology

Arthur Cole

If the latest research is correct, only about a quarter of data centers worldwide have embarked on some sort of virtualization strategy. That leaves the vast majority of you either still pondering the relative value of the technology or simply not in a position to rework your system architecture.

 

But that's not necessarily a bad thing, since it's usually the early adopters who have the most trouble getting new technology to live up to the hype.

 

And when it comes to virtualization, we've seen plenty of hype (myself included -- I've been touting virtualization's benefits for the past three years now). Which is why it's refreshing to see a report like this one from Forrester that spells out very clearly which virtual technologies are worthwhile right now and which ones still need a bit of work. (The skinny: server and client virtualization top the list of techniques with the most short-term promise, while application and storage virtualization should probably be left on the back burner for now.)

 

But once the decision to go virtual has been made, are there any good strategies to ensure that you get the maximum return on your investment? In general, yes -- but since every enterprise is unique, you'll no doubt be blazing your own trail at some point.

 

CiRBA CTO Andrew Hillier says one of the things to keep in mind is the kinds of applications that will rely on the virtual infrastructure. Different applications generate different types of workloads, which can have an impact on overall performance. There are also hardware/software compatibility, storage requirements, I/O infrastructure constraints and a range of other factors and relationships to weigh, any of which can make or break a virtual environment.
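
To make that concrete, here's a rough sketch, in Python purely for illustration, of the kind of candidacy check Hillier is describing. The server names, metrics and thresholds are all hypothetical stand-ins for whatever your own monitoring and inventory tools actually report.

```python
# Hypothetical per-server profile; in practice these numbers would come
# from your monitoring and asset-inventory tools.
from dataclasses import dataclass

@dataclass
class ServerProfile:
    name: str
    avg_cpu_pct: float    # average CPU utilization over a sample period
    peak_io_mbps: float   # peak disk/network I/O observed
    ram_gb: int           # provisioned memory
    hw_compatible: bool   # hypervisor supports this hardware/OS combination

def is_good_candidate(s: ServerProfile) -> bool:
    """Flag servers whose workloads are likely to consolidate well."""
    return (
        s.hw_compatible
        and s.avg_cpu_pct < 30.0    # plenty of idle capacity to reclaim
        and s.peak_io_mbps < 200.0  # I/O-heavy hosts often virtualize poorly
        and s.ram_gb <= 16          # small enough to pack guests densely
    )

servers = [
    ServerProfile("file-01", avg_cpu_pct=8.0, peak_io_mbps=40.0,
                  ram_gb=4, hw_compatible=True),
    ServerProfile("db-01", avg_cpu_pct=65.0, peak_io_mbps=900.0,
                  ram_gb=64, hw_compatible=True),
]

for s in servers:
    verdict = "candidate" if is_good_candidate(s) else "leave physical for now"
    print(f"{s.name}: {verdict}")
```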


 

There also needs to be a shift in thinking, says eWEEK's Michael Vizard in his Masked Intentions blog. Too often, the decision to go virtual is made for tactical reasons: consolidation, increased utilization, cost containment. A much better approach is to treat it as a strategic investment aimed at building a more modular architecture that breaks the traditional link between hardware and software. That way, you can better analyze what types of services your IT network should provide and then implement those services in a more orderly fashion.
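
As a minimal sketch of that services-first mindset, again with purely hypothetical service names and sizes: describe the services IT should provide independently of any physical box, then map them onto whatever shared capacity pool is available.

```python
# Hypothetical service catalog: what IT should provide, described with no
# reference to any particular piece of hardware.
SERVICE_CATALOG = {
    "intranet-web": {"vcpus": 2, "ram_gb": 4},
    "build-server": {"vcpus": 4, "ram_gb": 8},
    "mail-relay":   {"vcpus": 1, "ram_gb": 2},
}

def place_services(catalog, pool_vcpus, pool_ram_gb):
    """Greedily assign services to a shared capacity pool."""
    placed, deferred = [], []
    for name, req in catalog.items():
        if req["vcpus"] <= pool_vcpus and req["ram_gb"] <= pool_ram_gb:
            pool_vcpus -= req["vcpus"]
            pool_ram_gb -= req["ram_gb"]
            placed.append(name)
        else:
            deferred.append(name)
    return placed, deferred

placed, deferred = place_services(SERVICE_CATALOG, pool_vcpus=6, pool_ram_gb=12)
print("placed:", placed)
print("deferred until more capacity arrives:", deferred)
```

The point is less the algorithm than the separation: the catalog describes what the business needs, and the placement step, however crude, decides where it runs.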

 

That kind of vision may be easier said than done. According to Mainframe Executive's Jon William Toigo, the vast majority of virtualized servers run low-performance, low-utilization workloads, such as file and print services or low-traffic Web serving, rather than mission-critical database and e-mail functions. Part of the reason is that even when virtualized, x86 platforms have a tough time delivering the resources that high-performance applications demand.

 

Despite these issues, and at the risk of serving up a little more hype, it seems likely that virtualization is merely going through the normal growing pains that any disruptive technology experiences on the way to creating a new paradigm. Virtualization offers nothing less than a chance to rework stagnant enterprise topologies into a more flexible, responsive and, yes, cheaper environment, one that won't just accommodate 21st century data needs but will be vital to meeting them.


