In the era of data center efficiency, nearly every IT manager is familiar with the concept of hot aisles and cold aisles. By directing airflow properly in and around racks of humming equipment, the enterprise can reduce operating expenses even as it increases the utilization, and therefore the heat output, of key equipment.
What may not be widely known, however, is that there are numerous options when it comes to hot/cold designs, and what works for one facility may not be optimal, or even desirable, for another.
For example, some argue that the cold aisle containment portion of the equation may in fact be more crucial than hot aisle containment. According to Mark Hirst, head of T4 data center solutions at rack and cabinet designer Cannon Technologies, the difference comes down to the most effective use of cooling resources. Do you want cold air directed specifically at data equipment, or do you want it to dissipate hot air in the room at large? Neither approach is wrong per se, although cold aisle containment does provide faster cooling response in the event of sudden load spikes.
Regardless, you still need to get hot air away from critical systems, which is why the most common architectural component of the data center – the raised floor – is only partially effective, says XMission’s Grant Sperry. In too many cases, hot air is allowed to rise over the racks only to be drawn right back down again by ambient airflow. This is why IT managers should consider adding hot air plenums in the ceiling that can pull waste heat into nearby air handlers without a lot of expensive ductwork. At the same time, this delivers cooler return air to A/C compressors, so they don’t have to work as hard to maintain a constant, workable temperature in the racks.
Implementing these strategies, however, is usually impeded not by technology or cost, but by leadership, says Cary Frame, president of cooling solutions provider Polargy. Containment strategies affect both IT and facilities infrastructure, so if neither group is willing to take ownership of the problem, it simply goes unaddressed. Since the eventual solution often involves mechanical, electrical and other components, someone has to be willing to coordinate a number of disciplines to make sure the real problems are being effectively addressed. And it’s not as if there is an integrated vendor platform that can be deployed on a dime, either.
Also note that air handling in the data center is not necessarily the most effective way to maintain operating environments or reduce costs, says TechRepublic’s Nick Heath. Simply consolidating data loads and shutting down under-utilized hardware often produces better results, depending on your data needs and infrastructure configuration, of course. Facebook and Google, for example, may eke out only a few pennies per Gb with each new airflow design, but this can translate into big savings when applied to hyperscale facilities. A normal data center may find initial hot/cold aisle approaches to be very effective, but subsequent tweaks are likely to produce steadily diminishing returns.
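The consolidation idea above can be sketched in a few lines: scan a fleet for machines whose average utilization falls below some threshold and flag them as candidates for consolidation or shutdown. This is a minimal illustration only – the server names, utilization figures and 10 percent threshold are all hypothetical assumptions, not recommendations.

```python
# Hypothetical sketch: flag under-utilized servers as consolidation
# candidates based on average CPU utilization. Server names, readings,
# and the threshold are illustrative assumptions.

UTILIZATION_THRESHOLD = 0.10  # below 10% average CPU -> candidate

# Average CPU utilization per server over some sampling window.
servers = {
    "rack1-node01": 0.62,
    "rack1-node02": 0.04,
    "rack2-node01": 0.08,
    "rack2-node02": 0.35,
}

# Sort the candidates so the report is stable and easy to review.
candidates = sorted(
    name for name, util in servers.items()
    if util < UTILIZATION_THRESHOLD
)
print(candidates)
```

In practice, the utilization data would come from whatever monitoring stack is already in place, and the threshold would be tuned to the workload's tolerance for consolidation.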
The best course of action, then, is to make a thorough assessment of environmental conditions throughout the data center. Companies like Future Facilities and AdaptivCool offer software that provides thermal imaging and even real-time manipulation of data center environmental conditions.
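Even without commercial tooling, a basic version of that assessment is just comparing rack inlet temperatures against an acceptable envelope. The sketch below checks readings against the ASHRAE-recommended range for data centers (roughly 18–27 °C); the rack names and temperatures are made-up examples, not real sensor data.

```python
# Hypothetical sketch: flag racks whose inlet temperature falls outside
# the ASHRAE-recommended envelope (~18-27 C). Rack names and readings
# are illustrative assumptions.

ASHRAE_LOW_C = 18.0
ASHRAE_HIGH_C = 27.0

inlet_temps_c = {
    "rack-A1": 22.5,
    "rack-A2": 29.1,  # too hot: likely recirculation of exhaust air
    "rack-B1": 24.0,
    "rack-B2": 17.2,  # overcooled: wasted compressor work
}

# Racks above the envelope point to airflow problems (hot spots);
# racks below it usually mean cooling capacity is being wasted.
hot_spots = [rack for rack, t in inlet_temps_c.items() if t > ASHRAE_HIGH_C]
cold_spots = [rack for rack, t in inlet_temps_c.items() if t < ASHRAE_LOW_C]
print(hot_spots, cold_spots)
```

Hot spots typically mean exhaust air is finding its way back to the intakes – exactly the problem containment is meant to solve – while cold spots suggest the cooling plant is working harder than it needs to.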
In the end, air handling comes down to out with the bad, in with the good, but it takes fairly sophisticated know-how and probably a high degree of customization to make sure the effort is worth the return.