12 Summer Data Center Cooling Tips

Julius Neudorfer
Those lazy, hazy days of summer are upon us, and many small and midsize firms will see their data centers' cooling systems pushed to their limits, and even far beyond. So you may want to consider putting sunburn lotion on your servers, or you can use some of these cooling tips to keep them from overheating.

This is especially true for rooms located in mixed-use buildings that are not using large dedicated cooling systems with enough extra capacity for those very hot summer days. Many IT departments are 'sweating' out the summer (again), hoping that they will not have servers suddenly crashing from over-temperature shutdowns.

Here are a few tips, tricks and techniques that may not solve the long-term problem, but may help enough to get you through the summer. Often, when the equipment's heat load does not severely exceed the actual capacity of the cooling system, optimizing the airflow may improve the situation until a new or additional cooling system is installed:

1. If it feels warm, don't panic - even if you see 80°F in the cold aisle! Yes, this is hotter than the proverbial 70-72°F data center 'standard' you were used to (and you may not enjoy working in the room), but it may not be as bad for the servers as you think. Take temperature measurements at the front of the servers. This is where the servers draw in the cool air, and it is the only measurement that really matters. Take readings at the top, middle and bottom of the front of the racks (assuming that you have a Hot Aisle/Cold Aisle layout).

2. If the bottom areas of the racks are cooler, and you have open rack space, try to relocate the servers toward the bottom (or coolest area) of the racks. If the highest temperature reading at the front of the rack is 80°F or less, you are still within the latest 'Recommended' guidelines from ASHRAE TC 9.9. Even if the intake temperature is somewhat higher (up to 90°F), it is still within the 'Allowable' guidelines.
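If you log your front-of-rack readings, the check against the temperature bands cited above can be automated. A minimal sketch, assuming the 80°F 'Recommended' and 90°F 'Allowable' upper bounds from the article; the rack names and readings are made-up illustration data:

```python
# Classify front-of-rack intake readings against the bands the article cites.
RECOMMENDED_MAX_F = 80.0   # ASHRAE TC 9.9 'Recommended' upper bound (per article)
ALLOWABLE_MAX_F = 90.0     # 'Allowable' upper bound (per article)

def classify_intake(temp_f):
    """Return a status string for one front-of-rack intake reading."""
    if temp_f <= RECOMMENDED_MAX_F:
        return "recommended"
    if temp_f <= ALLOWABLE_MAX_F:
        return "allowable"
    return "over-temperature"

# Readings at top / middle / bottom of each rack front (hypothetical data).
readings = {
    "rack-01": [78.0, 75.5, 72.0],
    "rack-02": [86.0, 81.0, 76.0],
    "rack-03": [93.0, 88.0, 80.0],
}

for rack, temps in readings.items():
    worst = max(temps)  # the hottest intake point governs the rack
    print(f"{rack}: worst intake {worst:.1f} F -> {classify_intake(worst)}")
```

Only the worst (hottest) reading per rack matters, since any single intake above the limit can trip a thermal shutdown.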

3. Make sure that you use blanking panels to block off ANY and ALL open unused spaces in the front of the racks. This will prevent hot air from the rear re-circulating into the front of the racks.

4. Don't worry about rear temperatures - even if they are at 100°F or more! Do not place random fans blowing at the rear of racks to 'cool them down.' This just causes more mixing of warm air into the cold aisles. (I wish I had a dollar for every time I have seen this.)

5. If you have a raised floor, make sure that the floor grates or perforated tiles are properly located in front of the hottest racks. If necessary, rearrange or change to different floor grates to match the airflow to the heat load. Be careful not to locate floor grates too close to the CRACs; this will 'short circuit' the cool airflow immediately back into the CRACs and rob the rest of the room/row of sufficient cool air.

6. Check the raised floor for openings inside the cabinets. Cable openings in the floor allow air to escape the raised-floor plenum where it is not needed, lowering the amount of cold air available to the floor vents in the cold aisles. Use brush-type air-containment collar kits to minimize this problem.

7. If possible, try to re-distribute and evenly spread the heat loads into every rack to avoid or minimize "Hot Spots." Remember, check the temperature in the racks at the top, middle and bottom, before you move the servers, and relocate the warmer servers (again based on the front temperatures) to a cooler area. Then use blanking panels to fill any gaps. Recheck all the rack temperatures again to make sure that you have not just created new hot areas.

8. Check the rear of racks for cables blocking exhaust airflow. This will cause excessive back pressure for the IT equipment fans and can cause the equipment to overheat - even when there is enough cool air in front. This is especially true of racks full of 1U servers with a lot of long power cords and network cabling. Consider purchasing shorter (1-2 foot) power cords to replace the longer OEM cords shipped with most servers, and use the shortest practical network cables. Use cable management to unclutter the rear of the rack so that the airflow is not impeded.

9. If you have an overhead ducted cooling system, make sure that the cool air outlets are directly over the front of the racks and the return ducts are over the rear of the racks. I have seen sites where the ceiling vents and returns are poorly located and the room is very hot, yet the capacity of the cooling system has not been exceeded, because the cool air is not getting to the front of the racks. The most important issue is to make sure the hot air from the rear of the cabinets can get directly back to the CRAC return, without mixing with the cold air. If you have a plenum ceiling, consider using it to capture the warm air and add a ducted collar going into the ceiling from your CRAC's top return air intake. Some basic duct work will have an immediate impact on the room temperature. In fact, the warmer the return air, the higher the actual cooling capacity of the CRAC.
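That last point can be quantified with the common HVAC rule of thumb for sensible cooling, Q (BTU/hr) ≈ 1.08 × CFM × ΔT(°F): at a fixed airflow, a warmer return raises the delta-T across the coil and thus the heat removed. A rough sketch; the airflow and temperature figures are assumptions, not measurements from any real CRAC:

```python
# Sensible-heat rule of thumb: Q (BTU/hr) ~= 1.08 * CFM * delta-T (deg F).
def sensible_cooling_btuh(cfm, supply_f, return_f):
    """Approximate sensible cooling delivered at a given airflow and delta-T."""
    return 1.08 * cfm * (return_f - supply_f)

cfm = 8000        # assumed CRAC airflow (cubic feet per minute)
supply = 60.0     # assumed supply air temperature (deg F)

# Compare a mixed (cool) return against a well-separated (warm) return.
for return_f in (75.0, 85.0):
    q = sensible_cooling_btuh(cfm, supply, return_f)
    print(f"return {return_f:.0f} F -> ~{q:,.0f} BTU/hr (~{q / 12000:.1f} tons)")
```

With these assumed numbers, raising the return from 75°F to 85°F adds two-thirds more delivered capacity from the same unit, which is exactly why keeping hot exhaust air out of the cold aisle pays off.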

10. Consider adding temporary 'roll-in' type cooling units only if you can exhaust the heat into an external area. Running the exhaust ducts into a ceiling that goes back to the CRAC does not work. The heat exhaust ducts of the roll-in must exhaust into an area outside of the controlled space.

11. When the room is not occupied, turn off the lights. This can save 1-3 percent of electrical and heat load, which in a marginal cooling situation, may lower the temperature 1-2 degrees. 
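The lighting savings in tip 11 are easy to estimate, since every kW of lighting also adds about 3,412 BTU/hr of heat the cooling system must reject. A back-of-the-envelope sketch; the 1.5 kW lighting load and 12 unoccupied hours are made-up example figures:

```python
# Lights add heat at ~3412 BTU/hr per kW, so turning them off saves twice:
# once on the electric bill and again on the cooling load.
BTU_PER_HR_PER_KW = 3412   # standard conversion, 1 kW = 3412 BTU/hr

lighting_kw = 1.5          # assumed lighting load for a small server room
hours_dark = 12            # assumed unoccupied hours per day

heat_removed_btuh = lighting_kw * BTU_PER_HR_PER_KW
energy_saved_kwh = lighting_kw * hours_dark

print(f"Heat no longer rejected: ~{heat_removed_btuh:,.0f} BTU/hr")
print(f"Daily lighting energy saved: {energy_saved_kwh:.1f} kWh")
```

In a marginally cooled room, removing even a few thousand BTU/hr of heat can be the difference of a degree or two, as the tip suggests.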

12. Check to see if there is any equipment that is still plugged in and powered up, but is no longer in production. This is a fairly common occurrence and has an easy fix: just shut it off!

The Bottom Line

While there is no true quick fix when your heat load totally exceeds your cooling system's capacity, sometimes just improving the airflow may increase the overall efficiency by 5-20 percent. This may get you through the hottest days, until you can upgrade your cooling systems. In any event, it will lower your energy costs, which is always a good thing.

Plan ahead. If all else fails, have a fall-back plan to shut down the least-critical systems so that the more critical servers can remain operational (e.g., email vs. financial systems). Make sure to locate the most critical systems in the coolest area. This is better than having the most critical systems unexpectedly shut down from overheating.

This way you may actually be able to enjoy the weekend on the beach with your piña colada, instead of worrying about whether you will start getting (or perhaps not getting) high-temperature warning email messages on your smartphone, in which case you may need to quickly update your resume.
