    Strange Ideas Emerging to Improve Data Center Cooling


    Data center infrastructure and architecture may change over the years, but one thing remains constant: the need to lower operating costs.

    Aside from personnel expenses, the biggest factor on the opex side these days is power and cooling, and even the power portion of that equation is shrinking rapidly thanks to advances in data and energy management and new low-power systems. That leaves cooling as one of the key cost factors in the data center – a problem that is likely to grow more acute as virtualization and software-defined architectures lead to increased hardware density.

    It is surprising, then, that so many enterprises fail to take even rudimentary steps toward improving the energy/cooling ratio, let alone adopt the more advanced technologies designed to reduce the consumption of non-data-related systems in the data center. As Upsite Technologies’ Lars Strong explained to Data Center Knowledge recently, simple fixes like removing perforated tiles from hot aisles and whitespace or redirecting air flow under raised floors can go a long way toward reducing utility bills. Likewise, cracked seals in server racks and floor openings can waste valuable cold air, as can poorly calibrated temperature and humidity controls. Even empty cabinet spaces can lead to overcooling and imbalanced air flow across the rack and surrounding areas.
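
    To put rough numbers on why sealed racks and disciplined air flow matter, here is a minimal sketch using the standard sensible-cooling approximation for air (Q ≈ 1.08 × CFM × ΔT, in BTU/hr). The supply volume, bypass fraction and temperature rise below are hypothetical, but the arithmetic shows how quickly leaked cold air adds up:

        # Sensible-cooling arithmetic for bypass airflow: cold air that
        # escapes through cracked seals, floor openings or misplaced
        # perforated tiles returns to the CRAC units without absorbing
        # heat. All figures below are hypothetical.

        def sensible_cooling_btu_hr(cfm: float, delta_t_f: float) -> float:
            """Q (BTU/hr) ~= 1.08 * airflow (CFM) * temperature delta (deg F),
            the standard approximation for air at sea level."""
            return 1.08 * cfm * delta_t_f

        supply_cfm = 40_000     # total cold air delivered to the room
        bypass_fraction = 0.25  # share that never passes through IT gear
        delta_t_f = 20.0        # server inlet-to-outlet rise, deg F

        useful = sensible_cooling_btu_hr(supply_cfm * (1 - bypass_fraction), delta_t_f)
        wasted = sensible_cooling_btu_hr(supply_cfm * bypass_fraction, delta_t_f)

        # 12,000 BTU/hr equals one ton of refrigeration
        print(f"Cooling doing useful work: {useful / 12_000:.1f} tons")
        print(f"Capacity idled by bypass:  {wasted / 12_000:.1f} tons")

    In this (made-up) scenario, a quarter of the chiller capacity is produced and paid for but never touches a server inlet.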

    Cooling efficiency should be a top priority across the board, but newly built facilities have an opportunity to drive down cooling costs at the source: the energy needed to produce cold air or water in the first place. Some organizations are going to extreme lengths in pursuit of so-called “free cooling,” with many heading to temperate or even arctic climates, while others are finding ready sources of coolant in unexpected places, like natural gas facilities. It turns out that much of the cold generated when natural gas is converted to its liquid form for storage is simply wasted, and companies like TeraCool are hoping to tap these sources for data facilities. LNG is normally stored at -260 degrees F, so a significant amount of cold vapor is produced when the liquid is converted back to gas. Using a series of containment loops, TeraCool says it can deliver this coolant to an on-site data center or, alternatively, to an electricity turbine.
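
    For a back-of-the-envelope sense of the scale involved, the sketch below estimates the recoverable cooling from an LNG regasification stream, modeling LNG as pure methane (a latent heat of roughly 510 kJ/kg, plus the sensible heat absorbed as the vapor warms to ambient). The flow rate and the fraction a containment loop could actually capture are my own guesses, not TeraCool figures:

        # Back-of-the-envelope: cooling recoverable from LNG regasification.
        # LNG is modeled as pure methane; constants are approximate, and the
        # flow rate and recovery fraction are hypothetical.

        LATENT_HEAT = 510.0   # kJ/kg, methane heat of vaporization
        CP_VAPOR = 2.2        # kJ/(kg*K), methane vapor specific heat (approx.)
        WARMING_SPAN = 180.0  # K, from the -260 F boiling point up to ambient

        def cold_energy_kj_per_kg() -> float:
            """Total 'cold' released per kg of LNG sent back to gas."""
            return LATENT_HEAT + CP_VAPOR * WARMING_SPAN

        flow_kg_per_s = 50.0  # hypothetical regasification rate at a terminal
        recovery = 0.5        # fraction a containment loop might capture (guess)

        cooling_mw = flow_kg_per_s * cold_energy_kj_per_kg() * recovery / 1000.0
        print(f"~{cold_energy_kj_per_kg():.0f} kJ of cold per kg of LNG")
        print(f"~{cooling_mw:.0f} MW of cooling available at {flow_kg_per_s:.0f} kg/s")

    Even at 50 percent recovery, that works out to tens of megawatts of chiller-free cooling, which is why a storage terminal starts to look like prime data center real estate.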

    Amazon’s James Hamilton has an even more far-out idea – far out to sea, that is. His plan would place a data center atop an offshore desalination plant, which typically pulls sea water from depths of 100 feet or more to avoid plankton and other life forms that can clog filtration systems. The trouble is, cold water is more difficult to desalinate than warm water, so rather than heating it through conventional means, why not run it past a few server racks first? Not only does the data center get free cooling, but the plant itself can shave millions off its operating budget by using the warmer feed water. A company called DeepWater Desal is already on board, with a plan that would place a 150 MW data center at its proposed facility in California’s Monterey Bay.
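
    A little heat-balance arithmetic (Q = ṁ × cp × ΔT) suggests why the desalination side likes this deal. The sketch below asks how much a 150 MW data center would warm the plant’s intake stream, assuming the full IT load ends up as heat in the water; the intake flow rate is hypothetical:

        # How much would 150 MW of server heat warm the plant's feed water?
        # Rearranging Q = m_dot * cp * delta_T for delta_T. The intake flow
        # is hypothetical; cp and density of seawater are approximate.

        CP_SEAWATER = 3.99         # kJ/(kg*K)
        DENSITY_SEAWATER = 1025.0  # kg/m^3

        it_load_mw = 150.0      # the DeepWater Desal figure cited above
        intake_m3_per_s = 10.0  # hypothetical plant intake flow

        m_dot = intake_m3_per_s * DENSITY_SEAWATER               # kg/s
        delta_t_c = it_load_mw * 1000.0 / (m_dot * CP_SEAWATER)  # deg C

        print(f"Feed water warms by ~{delta_t_c:.1f} C at {intake_m3_per_s:.0f} m^3/s")

    A few degrees of pre-heating, delivered free, is exactly the head start the desalination process wants.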


    Cooling costs can also be reduced by turning the heat up rather than struggling to keep the A/C low. NEC Corp. recently released the Express5800 series server, built on the new Xeon E5-2600 v3 processor. Along with DDR4 memory, built-in Hyper-V support and a host of other bells and whistles, the server is rated for operation at up to 104 degrees F, well above the normal operating range of most data centers. It’s probably not advisable to kick the thermostat up that high, as it would likely cause your human operators to fail faster than your hardware, but it does open the possibility of, say, an 80-degree server room that can be serviced with only basic air flow techniques.
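
    The payoff of a higher rating is more hours of free cooling. The sketch below counts how many hours a year outside air alone could hold an 80-degree room versus a traditional 68-degree one; the hourly temperatures are synthetic placeholders for a real weather file, and the 5-degree economizer margin is a guess:

        # Counting free-cooling hours at two room setpoints. The hourly
        # outdoor temperatures are synthetic placeholders for a real
        # weather file, and the 5 F economizer approach margin is a guess.

        import random

        random.seed(0)
        outdoor_f = [55 + 25 * random.random() for _ in range(8760)]  # one year, hourly

        APPROACH_F = 5.0  # outside air must be this much cooler than the setpoint

        def free_cooling_hours(setpoint_f: float) -> int:
            return sum(1 for t in outdoor_f if t <= setpoint_f - APPROACH_F)

        for setpoint in (68.0, 80.0):
            hrs = free_cooling_hours(setpoint)
            print(f"Setpoint {setpoint:.0f} F: {hrs} hours "
                  f"({100 * hrs / 8760:.0f}% of the year) on outside air alone")

    Run it against a real weather file for your region and the gap between the two setpoints translates directly into chiller hours avoided.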

    As I mentioned earlier, cooling becomes even more challenging once hardware densities increase, and those densities are already hitting critical mass in hyperscale settings. So it’s reasonable to expect even more innovative cooling solutions to come from web-scale providers like Google and Amazon. Some of these may trickle down to the average facility, some won’t, but one thing is certain: Enterprise executives will not have the luxury of treating cooling as an afterthought for much longer.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.

