Earlier this month, Jonathan G. Koomey published a report, Growth in Data Center Electricity Use 2005 to 2010. Koomey, a consulting professor in the civil and environmental engineering department at Stanford University, showed that electricity used in U.S. data centers was significantly lower than predicted by the EPA's 2007 report to Congress.
Though the EPA expected electricity use in data centers worldwide and in the U.S. to double from 2005 to 2010, the report found it increased only 56 percent and 36 percent, respectively. Koomey attributed the lower-than-expected growth to fewer installed servers than projected. This, he found, was the result of a struggling economy, as well as technological improvements such as server virtualization, which lets one physical server do the work of several.
While Koomey's findings offer valid reasons for the slower-than-expected growth, cooling efficiencies not mentioned in the report may also have contributed. One factor in the efficiency equation could be the American Society of Heating, Refrigerating and Air-Conditioning Engineers' (ASHRAE) expansion of its recommended environmental envelope.
In 2004, ASHRAE's low-end and high-end temperature recommendations were 68F and 77F, respectively. In 2008, the organization widened that range to 64.4F and 80.6F. (I know from personal experience that there was a time when IT administrators wanted their data centers to feel like meat lockers. But over the last few years, data center set points have been climbing steadily.)
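The 2008 limits look oddly precise in Fahrenheit, but they are round numbers in Celsius (18C and 27C), which is the scale ASHRAE uses. A quick conversion sketch makes that clear:

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# ASHRAE's 2008 expanded envelope, converted (rounded to one decimal
# to hide floating-point noise):
print(round(f_to_c(64.4), 1))  # -> 18.0
print(round(f_to_c(80.6), 1))  # -> 27.0
```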
ASHRAE's expanded envelope opened the door for economical and innovative cooling methods, such as free cooling and indirect evaporative cooling. Given that cooling can account for 40-50 percent of a data center's total energy use, improving efficiency in this area can significantly reduce overall operating costs.
From a business standpoint, lowering energy use gives data centers a prime opportunity to generate more revenue, especially in an industry where power constraints are increasingly common. Most data centers run out of power before they run out of real estate. Lowering the energy used for cooling frees up power and funds that can then be put toward additional IT equipment. It also helps defer expansion projects by allowing more IT equipment to operate within the constraints of the existing facility.
Several large corporations have already turned to innovative green cooling technologies, such as direct and indirect evaporative cooling, to lower the overall electricity usage of their data centers.
One case in point is Google, which operates 2 percent of the world's servers but uses less than 1 percent of the electricity reported in Koomey's study. (A Google executive confirmed this figure.) That's clearly the result of making energy efficiency a corporate priority.
Google isn't alone. Facebook uses new evaporative cooling methods in its newest data center. And SuperNAP, a colocation facility in Las Vegas, is keeping pace by cooling its 2.2 million sq. ft. of data space with hybrid indirect evaporative cooling technology.
But, cutting costs through innovative means isn't just for the big dogs. Green cooling technology is now being scaled down to help data centers of all sizes become more efficient.
Green House Data, a colocation provider in Cheyenne, WY, cut its energy use for cooling by 90 percent using indirect evaporative cooling technology from Coolerado, a Colorado-based manufacturer that specializes in energy-efficient air conditioning systems.
Shawn Mills, owner and CEO of Green House Data, said the new cooling method reduced the building's total energy consumption by 40 percent.
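Those two figures are consistent with the cooling-share estimate cited earlier. If cooling is some fraction of a building's total load, cutting cooling energy by a given amount reduces total energy by that fraction times the cut. A back-of-the-envelope sketch (the 90 percent and 40 percent values come from the article; the 44 percent cooling share is inferred, not reported):

```python
def total_reduction(cooling_fraction, cooling_cut):
    """Fraction of total building energy saved when cooling energy is cut.

    cooling_fraction: share of total energy used by cooling (0 to 1)
    cooling_cut: fractional reduction in cooling energy (0 to 1)
    """
    return cooling_fraction * cooling_cut

# A 90% cooling cut producing a 40% total reduction implies cooling
# was roughly 44% of the building's load -- squarely within the
# 40-50% range cited above.
print(round(total_reduction(0.44, 0.90), 2))  # -> 0.4
```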
By adopting these types of innovative cooling technologies, data centers can continue to lower their energy usage and overall operating costs. The savings provide additional means of generating revenue and help reduce our collective energy consumption.