Data Center Cooling: New Leases on Life

Arthur Cole

The data center industry has focused much of its attention on heating and cooling issues for the better part of a decade. This is only natural in times of economic uncertainty, when the ability to meet unexpected challenges, like a sudden spike in fuel prices, is often all that separates winners from losers in the free market.


However, a prevailing school of thought has held that once the "low-hanging fruit" of energy conservation has been picked (things like virtualization, hot/cold-aisle designs and high-efficiency HVAC equipment), any further initiatives would produce ever-diminishing returns.


While that may still turn out to be the case (the numbers just aren't in yet), signs are encouraging that the next generation of green technologies will produce at least near-equivalent results, and may even prove more effective.


Take the idea of "free cooling." Piping ambient air into the data center is certainly a low-cost option, but it isn't expected to produce dramatic results in all but the coldest regions. However, a Montreal company called SM Group International says it can cut costs in half, even in warmer locations, by designing data centers with pressurized equipment rooms. Pressurization allows pre-cooled air to blow directly over servers and other gear, preventing heat buildup in the first place. That, in turn, supports higher utilization and lower hardware requirements, which enhances energy conservation even further.


Even without such a redesign, it turns out that data equipment is not as averse to high operating temperatures as once thought. The University of Toronto recently found in a study of data equipment failures that as temperatures go up, error rates increase linearly rather than exponentially, at least up to about 122 degrees F (50 degrees C). Yes, an increase is still an increase, but the risk of running at higher temperatures is much lower than expected, which may be part of the reason Google, Microsoft and others are comfortable with room temperatures as high as 80 degrees F.
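
To see why the linear-versus-exponential distinction matters, here is a minimal Python sketch comparing the two growth models over the temperature range in question. The coefficients are invented purely for illustration; they are not taken from the Toronto study.

# Illustrative only: compares a linear and an exponential failure-rate
# model over the temperature range discussed above. The coefficients are
# made up for the sake of comparison, not drawn from the Toronto study.

BASE_TEMP_F = 68.0   # assumed baseline room temperature
BASE_RATE = 1.0      # normalized error rate at the baseline

def linear_rate(temp_f, slope=0.02):
    """Error rate grows by a fixed increment per degree."""
    return BASE_RATE + slope * (temp_f - BASE_TEMP_F)

def exponential_rate(temp_f, factor=1.02):
    """Error rate multiplies by a fixed factor per degree."""
    return BASE_RATE * factor ** (temp_f - BASE_TEMP_F)

for temp in (68, 80, 95, 110, 122):
    print(f"{temp}F  linear: {linear_rate(temp):4.2f}x"
          f"  exponential: {exponential_rate(temp):4.2f}x")

Even with the same 2-percent-per-degree sensitivity, the exponential model nearly triples the error rate by 122 degrees F while the linear one merely doubles it, which is the difference the study is drawing.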


Other techniques are producing equally dramatic results. Green Revolution Cooling reports that its CarnoJet submersion cooling system can cut power usage by some 40 percent in rack systems containing upwards of 40 servers and more than 40,000 cores. Key to the system is a specialized liquid mineral dielectric called GreenDef that provides improved cooling and itself requires less energy than comparable air-cooled approaches.


And if you're in a position to retrofit or build from scratch, you may want to ditch the whole raised-floor design that has been a staple of server rooms for the past several decades. APC by Schneider Electric reports in a recent white paper that overhead cabling is the way to go because it improves airflow in the plenum and requires fewer holes in floor panels through which cool air can escape. The company says a properly designed room can cut consumption by nearly a quarter.



There will likely be a never-ending stream of technologies and design innovations aimed at improving efficiency, reducing waste and cutting carbon footprints in the data center. The challenge lies in figuring out which ones deliver real-world savings and which merely provide cosmetic change.


Through a combination of more heat-tolerant hardware and more efficient cooling systems, data centers could be on the verge of another dramatic reduction in operating costs.



Jun 1, 2012 3:59 AM Jack Bedell-Pearce says:

Hi,

I thought it might be worth adding that there is a form of 'free cooling' that can be used in non-cool climates. Despite the UK's reputation for being a mostly cloudy and rainy place, our data centre in the South East of England quite often has an ambient air temperature above 19 degrees centigrade (the temperature at which we like to deliver air to our cold aisles).

During the winter, we pre-mix the warm air generated by the servers with the very cold ambient air to deliver our target temperature (this is more complicated than it seems, but is entirely achievable with the right controls).
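
As a rough illustration of that mixing step, here is a simplified Python sketch based on an ideal energy balance between the two air streams. It ignores humidity and density differences, and the temperatures are assumed round numbers rather than 4D's actual figures.

# Simplified winter air-mixing calculation: what fraction of warm return
# air must be blended with cold ambient air to hit the cold-aisle target?
# Assumes ideal mixing with equal specific heats; ignores humidity.

def return_air_fraction(t_target, t_ambient, t_return):
    """Fraction of warm return air so the blend lands on t_target.

    Energy balance: t_target = f * t_return + (1 - f) * t_ambient.
    """
    if t_return <= t_ambient:
        raise ValueError("return air must be warmer than ambient air")
    return (t_target - t_ambient) / (t_return - t_ambient)

# Example: 19C cold-aisle target, 5C winter ambient, 32C server exhaust.
f = return_air_fraction(t_target=19.0, t_ambient=5.0, t_return=32.0)
print(f"Recirculate {f:.0%} return air, {1 - f:.0%} fresh ambient air")
# -> roughly 52% return air to 48% ambient air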

During the summer, we use something called evaporative (or adiabatic) cooling. It works on the simple principle that when water evaporates, it draws energy and heat away with it. Just as people sweat during the summer to cool down, this technology draws warm ambient air through a wetted filter, which causes some of the water to evaporate and cools the ambient air down.
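
For a sense of scale, here is a back-of-the-envelope Python sketch of the cooling power available from evaporation, using the standard latent heat of vaporization of water (roughly 2,260 kJ/kg). The 100-litre-per-hour flow is an assumption matching the figure quoted below.

# Back-of-the-envelope: evaporating water absorbs its latent heat of
# vaporization, and that energy is drawn out of the air stream.

LATENT_HEAT_KJ_PER_KG = 2260.0  # approximate value for water

def evaporative_cooling_kw(litres_per_hour):
    """Cooling power, in kW, from fully evaporating the given water flow."""
    kg_per_second = litres_per_hour / 3600.0  # 1 litre of water ~ 1 kg
    return kg_per_second * LATENT_HEAT_KJ_PER_KG

print(f"{evaporative_cooling_kw(100):.0f} kW of cooling from 100 L/hour")
# -> roughly 63 kW, before real-world losses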

During a very hot day in the UK (which, from a cooling perspective, is mercifully rare), a standard evaporative cooler will use approximately 100 litres of water per hour. That may sound like a lot, but it takes 170 litres of water to produce one pint of beer, and washing a car with a hose pipe can take up to 480 litres.

In energy terms, the industry figure for the carbon cost of water is around 0.298 g per litre, while it takes 620 g of carbon to produce 1 kWh of electricity. In other words, in terms of carbon, 1 kWh of energy equals roughly 2,080 litres of water.
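
That equivalence follows directly from the two figures, as this small Python check shows:

# Carbon equivalence check using the figures quoted above.
CARBON_PER_LITRE_G = 0.298  # g of carbon per litre of treated water
CARBON_PER_KWH_G = 620.0    # g of carbon per kWh of grid electricity

litres_per_kwh = CARBON_PER_KWH_G / CARBON_PER_LITRE_G
print(f"1 kWh ~ {litres_per_kwh:,.0f} litres of water in carbon terms")
# -> about 2,081 litres, in line with the 2,080 quoted above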

We've calculated our annualised average PUE to be 1.14, which we believe makes our data centre one of the most energy-efficient in the world (we even won an award recently, beating Capgemini).
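
For readers unfamiliar with the metric, PUE (power usage effectiveness) is total facility energy divided by the energy delivered to the IT equipment alone, so 1.0 would be a perfect score. The annual figures in this short Python sketch are invented to show what 1.14 means in practice.

# PUE = total facility energy / IT equipment energy; 1.0 is ideal.
# The figures below are hypothetical, chosen to land on a PUE of 1.14.

def pue(total_facility_kwh, it_equipment_kwh):
    """Ratio of total facility energy to IT equipment energy."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical year: 1,000 MWh to the IT load plus 140 MWh of overhead
# (cooling, lighting, power-distribution losses and so on).
print(f"PUE = {pue(1_140_000, 1_000_000):.2f}")  # -> 1.14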

You can also find more information about our cooling system on our website here:

http://www.4d-dc.com/about-us/green-credentials/

Jack Bedell-Pearce, Commercial and Operations Director

4D Data Centres
