Seven Best Practices for Virtualization
Virtualization is taking IT to new horizons from which whole new sets of opportunities are coming into view.
Virtualization has been a godsend for the enterprise, particularly as it has struggled to keep costs down in the face of increasing data loads and rising energy costs. Still, there's no question that it has placed a heavy burden on physical infrastructure, which has had to shift from processing and storing data and applications on dedicated machines to sharing those workloads among disparate resources.
Significant changes are in the pipeline, however, as new approaches to data handling gain footholds on the most fundamental levels of IT infrastructure. Silicon, for one, has caught the virtual bug. After years of Moore's Law and the relentless pursuit of greater processing power, development has shifted toward more cooperative architectures that value data transfer and communication more than clock speed and circuitry.
Intel's new Xeon E5-2600 marks a watershed in this regard. The chip features an integrated I/O controller supporting the PCIe 3.0 interface, which delivers 8 gigatransfers per second per lane, bolstered by the Direct Data I/O architecture that moves data directly into processor cache rather than making a pit stop in main memory. The result is lower latency and less power consumption.
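For a rough sense of what 8 gigatransfers per second means in practice, a back-of-the-envelope calculation can convert the raw transfer rate into usable bandwidth. This sketch assumes PCIe 3.0's standard 128b/130b line encoding (part of the PCIe 3.0 spec itself, not anything Intel-specific), under which 128 of every 130 transferred bits carry payload:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth estimate.
# PCIe 3.0 signals at 8 GT/s per lane and uses 128b/130b encoding,
# so roughly 98.5% of the raw transfer rate is usable payload.
GT_PER_SEC = 8e9               # raw transfers (bits) per second, per lane
ENCODING_EFFICIENCY = 128 / 130

def lane_bandwidth_gbytes(lanes: int = 1) -> float:
    """Approximate usable bandwidth in GB/s for a given lane count."""
    bits_per_sec = GT_PER_SEC * ENCODING_EFFICIENCY * lanes
    return bits_per_sec / 8 / 1e9  # bits -> bytes -> GB

print(f"x1:  {lane_bandwidth_gbytes(1):.2f} GB/s")   # ~0.98 GB/s per lane
print(f"x16: {lane_bandwidth_gbytes(16):.2f} GB/s")  # ~15.75 GB/s for a x16 slot
```

That near-gigabyte-per-second figure per lane is what makes moving data straight into cache, rather than through main memory, worth the silicon it costs.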
The device also sports Intel's Turbo Boost Technology and Node Manager power management, which work to shift power between cores and other components to more accurately reflect data loads. It can also tap into the Data Center Management stack that provides broad silicon-level visibility and dynamic load balancing across rack and cluster configurations.
All of this serves to boost the E5's performance some 80 percent over the existing 5600 platform, even while it cuts energy usage in half. Most users will see improved performance even in non-virtual environments, but the true gains will be seen in highly distributed architectures that rely on high-speed networking to ferry workloads across multiple resources.
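It is worth noting that the two claims compound: if performance rises roughly 80 percent while energy use is halved, performance per watt improves by a factor of about 3.6. A quick sanity check using the article's figures, normalized against the 5600 platform as a baseline:

```python
# Performance-per-watt comparison, normalizing the Xeon 5600 platform to 1.0.
baseline_perf, baseline_power = 1.0, 1.0   # Xeon 5600 (baseline)
e5_perf = baseline_perf * 1.8              # ~80 percent faster
e5_power = baseline_power * 0.5            # ~half the energy use

gain = (e5_perf / e5_power) / (baseline_perf / baseline_power)
print(f"Performance-per-watt improvement: {gain:.1f}x")  # 3.6x
```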
Rival AMD, of course, is also working the virtualization angle, but has taken a somewhat different tack. The company recently purchased server maker SeaMicro for $334 million with the idea of integrating its technology with next-generation Opterons into new fabric-based offerings for OEM customers. Initial reports describe these "building blocks" as card-sized devices outfitted with CPU, DRAM and customized ASICs designed to support highly virtualized environments.
These days, it is difficult to find an enterprise that has yet to implement virtualization on some level. That being said, few organizations have pushed their level of virtualization much above 30 percent or so, a nod to the fact that hardware still has its limits.
Pushing past those boundaries requires new thinking on the relationships between data, applications and the resources that support them. That means changes across the board, starting with the most basic components.