Cooling a More Mature Green Data Center

Arthur Cole

The concept of "going green" in the data center may have jumped the shark, as the saying goes, but only in the sense that adopting energy-efficient technologies and practices is no longer about catch phrases and empty promises.


Instead, it appears that the movement has transcended the novelty phase and settled in as a permanent component of the upgrade and expansion process. That means you're more likely to see green technologies put in place for their impact on the bottom line than for their PR value.


Nowhere is this more apparent than in cooling systems. Where once the standard practice was to build out massive cooling plants regardless of how much energy they consumed, the newest designs are all about keeping systems cool without busting the budget.


For example, Google recently powered up a chillerless data center in Belgium, forgoing the massive refrigerators that typically inhabit large facilities in favor of naturally cool water from a nearby industrial canal. On the few days of the year when it gets too hot, the company plans to shut down systems and shift loads to other centers.
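
The decision logic behind that design is simple to sketch. Here is a minimal illustration; the temperature threshold, function names and job-placement scheme below are assumptions for the sake of example, not Google's actual control system:

    # Illustrative sketch of chillerless free cooling with load shedding.
    # FREE_COOLING_LIMIT_F is a hypothetical cutoff, not a published figure.
    FREE_COOLING_LIMIT_F = 80.0

    def plan_load(canal_temp_f, jobs, fallback_sites):
        """Decide where jobs run based on canal water temperature."""
        if canal_temp_f <= FREE_COOLING_LIMIT_F:
            # Canal water is cold enough: keep everything local.
            return {"local": jobs, "shifted": []}
        # Too hot for free cooling: shift work to other data centers
        # instead of running mechanical chillers (this facility has none).
        shifted = [(job, fallback_sites[i % len(fallback_sites)])
                   for i, job in enumerate(jobs)]
        return {"local": [], "shifted": shifted}

    print(plan_load(85.0, ["web", "batch"], ["dublin", "hamina"]))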


Air-cooling systems are also getting a makeover. APC just launched a series of row-based cooling systems that not only use less power than traditional designs but are also more scalable and can be installed at greater densities. The system directs cool air onto individual rows of servers, allowing rows running high-density applications to receive extra cooling. A single 600 mm row provides up to 7 kW of cooling capacity and comes equipped with monitoring and automation features that dynamically adjust capacity to maintain constant temperatures at the server inlets.
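
In the spirit of that automation, here is a minimal control-loop sketch. The 7 kW ceiling per 600 mm row comes from the article; the setpoint, gain and function name are illustrative assumptions, not APC's actual firmware logic:

    # Toy incremental proportional controller: nudge a row cooler's output
    # toward holding a constant server-inlet temperature.
    ROW_CAPACITY_KW = 7.0      # max cooling per 600 mm row (from the article)
    SETPOINT_F = 77.0          # assumed target inlet temperature
    GAIN_KW_PER_DEG = 1.5      # assumed proportional gain

    def cooling_output_kw(inlet_temp_f, current_kw):
        """Return the next cooling output, clamped to the row's limits."""
        error = inlet_temp_f - SETPOINT_F
        desired = current_kw + GAIN_KW_PER_DEG * error
        return max(0.0, min(ROW_CAPACITY_KW, desired))

    # Example: inlet running 2 degrees hot, cooler currently at 3 kW.
    print(cooling_output_kw(79.0, 3.0))  # steps output up toward 6 kW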


Another movement afoot in energy-efficient circles is coordination with power and cooling experts at the systems integration stage. Sun Microsystems recently teamed up with Emerson Network Power to offer customized solutions for individual facilities. Emerson maintains power and cooling specialist teams throughout the world capable of devising plans and deploying specific products and services designed to improve data center efficiency. One of the first customers is Sandia National Laboratories, which recently received a new series of Sun Blade X6275 modules and the Sun Cooling Door system tied to Emerson's Liebert XD cooling platform.


When it comes to keeping things cool, the twin dangers are doing too little and doing too much, according to Amazon engineer James Hamilton. On the too-little side, he cites failing to seal the airflow paths into and out of the rack; the too-much crowd includes enterprises that routinely over-cool their rooms. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends 81 degrees F for today's servers, with an allowable range that extends to 90 degrees. If you look hard enough, you might even find some equipment that's rated for over 100 degrees.
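
A quick way to reason about those two failure modes is a simple inlet-temperature check. The 81- and 90-degree figures come from the guidance cited above; the over-cooling floor is an assumed value added purely for illustration:

    # Classify a server-inlet reading against the temperature bands above.
    RECOMMENDED_MAX_F = 81.0   # ASHRAE recommendation cited in the article
    ALLOWABLE_MAX_F = 90.0     # allowable ceiling cited in the article
    OVERCOOLED_BELOW_F = 65.0  # assumed floor; not from the article

    def assess_inlet(temp_f):
        if temp_f > ALLOWABLE_MAX_F:
            return "doing too little: above the allowable limit"
        if temp_f > RECOMMENDED_MAX_F:
            return "warm: inside allowable, above recommended"
        if temp_f < OVERCOOLED_BELOW_F:
            return "doing too much: likely over-cooled"
        return "within the recommended range"

    for reading in (62.0, 75.0, 84.0, 92.0):
        print(reading, "->", assess_inlet(reading))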


Despite the advances that have taken place in cooling systems of late, the fact is that energy efficiency remains the second most important factor in any redesign. The primary consideration should be reliability. All the energy savings in the world won't amount to a hill of beans if the system fails outright or fails to maintain a proper working temperature.


The good news is that these two requirements are not at odds anymore. Greater efficiency is working hand-in-hand with greater reliability, which means you still maintain the same productivity you had before -- you just pay less for it over time.



Comments
Aug 26, 2009 11:14 AM John says:

Art, we think that in mature data centers, as your article refers to them, liquid-cooled systems may be the way forward. Based on research IBM has done, it appears that removing excess heat from data centers via water is as much as 4,000 times more efficient than via air. My firm led the thermal engineering track at the ATCA Summit in 2008, and we had a number of guest speakers and panelists there from Emerson, AT&T, Lucent, Schroff and others. One point came out loud and clear: keep water as far away from the CPU as possible -- but getting it closer performs better. So now, as an industry, we have to hone the current generation of water-based cooling solutions. Emerson and IBM both offer retrofits for bringing water up to the cabinet level, for example.

Based on customer feedback about water, we've been looking a bit at CO2. It has seven times the cooling capacity of water per kilogram, poses no danger to electrical equipment, has no ozone-depleting potential, and has low toxicity. Another plus? The carbon footprint of IT equipment rooms can be reduced by 50 to 70 percent.
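
As a rough illustration of what that per-kilogram figure implies, here is a back-of-the-envelope comparison. Water's specific heat is a standard value; the 7x factor is the claim above (not verified here), and the heat load and temperature rise are arbitrary example values:

    # Coolant mass flow needed to remove a given heat load.
    heat_load_kw = 70.0   # example rack heat load
    delta_t_k = 10.0      # example coolant temperature rise
    cp_water = 4.18       # kJ/(kg*K), standard specific heat of water

    water_flow = heat_load_kw / (cp_water * delta_t_k)  # kg/s of water
    co2_flow = water_flow / 7.0                         # claimed 7x capacity

    print(f"water: {water_flow:.2f} kg/s, CO2: {co2_flow:.2f} kg/s")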

Nov 26, 2010 5:50 AM HVAC Charleston says:

Nowadays, the need for cooling and/or heating systems has become flexible, and they are required for a wide variety of purposes. Whether you are cooling your home, office or data center, the important thing to remember is proper maintenance, so the system runs reliably and keeps energy costs down by staying in good condition throughout.

