The data center industry has focused a large part of its attention on heating and cooling issues for the better part of a decade. This is only natural in times of economic uncertainty when the ability to meet unexpected challenges, like a sudden spike in fuel prices, is often all that separates winners from losers in the free market.
However, a prevailing school of thought has held that once the "low-hanging fruit" of energy conservation has been picked - things like virtualization, hot/cold aisle designs and high-efficiency HVAC equipment - any further initiatives would produce ever-diminishing returns.
While that may still turn out to be the case - the numbers just aren't in yet - signs are encouraging that the next generation of green technologies will at least produce near-equivalent results, and may even prove to be more effective.
Take the idea of "free cooling." Piping ambient air into the data center in cooler climates is certainly a low-cost option, but it isn't expected to produce overly dramatic results in all but the coldest regions. However, a Montreal company called SM Group International says it can cut costs in half, even in warmer locations, by designing data centers with pressurized equipment rooms. This allows pre-cooled air to blow directly over servers and other gear to prevent heat buildup in the first place, a trick that fosters increased utilization and, thus, lower hardware requirements, which enhances energy conservation even further.
Even without such a redesign, it turns out that data equipment is not as averse to high operating temperatures as once thought. A recent University of Toronto study of data equipment failures found that as temperatures rise, error rates increase linearly rather than exponentially, at least up to about 122 degrees F (50 C). Yes, an increase is still an increase, but the risk of running at higher temperatures is much lower than expected, which may be part of the reason Google, Microsoft and others are comfortable with room temperatures as high as 80 degrees F.
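To see why linear growth is so much more forgiving than exponential growth, consider a toy model. All coefficients below are illustrative assumptions for comparison purposes, not figures from the Toronto study:

```python
# Toy comparison of linear vs. exponential error-rate growth with temperature.
# The base rate, slope, and doubling step are illustrative assumptions only.

def linear_rate(temp_c, base_rate=1.0, slope=0.05, base_temp=20.0):
    """Error rate grows by a fixed increment per degree above base_temp."""
    return base_rate + slope * (temp_c - base_temp)

def exponential_rate(temp_c, base_rate=1.0, factor=2.0, step=10.0, base_temp=20.0):
    """Error rate doubles for every `step` degrees above base_temp."""
    return base_rate * factor ** ((temp_c - base_temp) / step)

for temp in (20, 30, 40, 50):
    print(f"{temp} C: linear={linear_rate(temp):.2f}  "
          f"exponential={exponential_rate(temp):.2f}")
```

Under these assumed parameters, raising the room from 20 C to 50 C increases a linear error rate 2.5-fold but an exponential one 8-fold - the gap that makes warmer operating temperatures a tolerable trade-off.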
Other techniques are producing equally dramatic results. Green Revolution Cooling reports that its CarnotJet submersion cooling system can cut power usage some 40 percent in rack systems containing upwards of 40 servers and more than 40,000 cores. Key to the system is a specialized mineral-oil dielectric coolant called GreenDEF that provides improved cooling and consumes less energy itself than comparable air-cooled approaches.
And if you're in a position to retrofit or build from scratch, you may want to ditch the raised-floor design that has been a staple of server rooms for the past several decades. APC by Schneider Electric reports in a recent white paper that overhead cabling is the way to go because it improves airflow in the underfloor plenum and requires fewer holes in floor panels where cool air can escape. The company says a properly designed room can cut consumption by nearly a quarter.
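As a rough back-of-the-envelope check on what "nearly a quarter" means in practice, the sketch below applies that figure to a hypothetical facility. The 1 MW load and $0.10/kWh electricity rate are assumptions chosen purely for illustration:

```python
# Back-of-the-envelope annual savings for a hypothetical 1 MW data center.
# Facility size and electricity price are assumptions; the 24% saving stands
# in for the "nearly a quarter" reduction cited in the white paper.

facility_kw = 1000           # assumed total facility load in kilowatts
hours_per_year = 24 * 365    # 8,760 hours
price_per_kwh = 0.10         # assumed electricity rate, USD per kWh
savings_fraction = 0.24      # "nearly a quarter"

annual_kwh = facility_kw * hours_per_year
annual_savings = annual_kwh * price_per_kwh * savings_fraction
print(f"Annual savings: ${annual_savings:,.0f}")
```

Even with these modest assumed rates, the reduction works out to roughly $210,000 a year - real money for a redesign that mostly involves where the cables run.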
Through a combination of better heat-tolerating hardware and more efficient cooling systems, data centers could be on the verge of another dramatic reduction in operating costs.