You’ve probably heard the old saw that goes, “There’s an easy way to do something, and there’s a hard way.” When it comes to energy consumption in the data center, the saying is technically correct, although there is a caveat: The hard way is not necessarily wrong, and the easy way is not necessarily the lazy way out. Instead, there is room for both major overhauls to data infrastructure, like DCIM and converged systems, and small moves, like turning out the lights and raising the thermostat.
In fact, says Energy Manager Today’s Carl Weinschenk (also an IT Business Edge contributor), even the easy way can add up considerably over time. One of the newest trends making the rounds is to paint data center equipment white, which, according to DAMAC’s Dave Johnson, requires less lighting to make units visible to IT techs and, consequently, generates less heat from those light sources. In addition, leaving as little as an inch and a half of space behind devices in the rack can vastly improve airflow and simplify cabling.
Many data centers are still poorly laid out, however, which is why Upsite Technologies has designated June as Airflow Management Awareness Month. The idea is to use a series of webinars and other events to draw attention to the significant savings that proper airflow management can deliver, replacing the myths of data center cooling with science. The firm has already addressed airflow best practices and data center containment issues and is slated to take on IT operations and proper AC measurement techniques. As it stands now, the U.S. data center industry alone is projected to draw about 140 billion kilowatt-hours per year by 2020, emitting roughly 100 million metric tons of carbon into the atmosphere.
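To put those projections in perspective, a quick back-of-the-envelope check shows what they imply and what even a modest efficiency gain would be worth. The kilowatt-hour and carbon totals come from the article; the derived intensity figure and the illustrative 10 percent savings scenario are assumptions for the sake of the arithmetic:

```python
# Back-of-the-envelope check of the projected U.S. data center figures above.
# The annual kWh and CO2 totals are from the article; everything derived from
# them here is illustrative.

annual_kwh = 140e9            # projected U.S. data center draw by 2020 (kWh/year)
annual_co2_tonnes = 100e6     # roughly 100 million metric tons of carbon per year

# Implied average carbon intensity of the power feeding those facilities
implied_intensity = annual_co2_tonnes * 1000 / annual_kwh  # kg CO2 per kWh
print(f"Implied carbon intensity: {implied_intensity:.2f} kg CO2/kWh")

# A hypothetical 10 percent efficiency gain from better airflow management
savings_tonnes = 0.10 * annual_co2_tonnes
print(f"A 10% efficiency gain avoids ~{savings_tonnes / 1e6:.0f} million metric tons/year")
```

Even the “easy” fixes scale: against a 100-million-ton baseline, single-digit percentage improvements translate into millions of tons of avoided emissions.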
Another relatively easy fix for high energy bills is to reassess the extent to which modern hardware must be overprovisioned, says Datacenter Knowledge’s Yevgeniy Sverdlik. A recent report by data center management firm Coolan showed the impact that poor utilization has on power consumption, particularly in clustered environments. While 80 percent power utilization is considered optimal, few clusters approach even half that, which means that organizations are running not only more data hardware than they need, but more power and cooling systems as well. To address this, organizations need to take a hard look at the workload distribution across the cluster and the supply and redundancy characteristics of current power and cooling infrastructure.
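The kind of audit this implies can be sketched in a few lines: compare each cluster’s average draw against its provisioned power capacity and flag those far below the 80 percent target the article cites. This is a minimal illustration, not Coolan’s methodology; all cluster names and figures below are invented:

```python
# Hypothetical power-utilization audit. The ~80% target comes from the article;
# the cluster names and kW figures are invented for illustration.

OPTIMAL_UTILIZATION = 0.80

clusters = {
    # name: (average draw in kW, provisioned power capacity in kW)
    "analytics": (120.0, 400.0),
    "web-tier": (310.0, 400.0),
    "batch": (95.0, 500.0),
}

def utilization_report(clusters, target=OPTIMAL_UTILIZATION):
    """Return {name: (utilization, stranded_kw)} for clusters below the target."""
    report = {}
    for name, (draw, capacity) in clusters.items():
        util = draw / capacity
        if util < target:
            # Capacity (and the matching cooling) provisioned beyond what the
            # current draw would need at the target utilization level.
            stranded = capacity - draw / target
            report[name] = (util, stranded)
    return report

for name, (util, stranded) in utilization_report(clusters).items():
    print(f"{name}: {util:.0%} utilized, ~{stranded:.0f} kW over-provisioned")
```

The stranded-capacity figure is the lever here: every kilowatt of provisioned power a cluster does not need is also a kilowatt of cooling and redundancy infrastructure it does not need.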
And for facilities that operate in difficult climates where renewable power supplies and free cooling sources are scarce, the climate excuse might not hold up for much longer. When it comes to heat and humidity, few locations are as unpleasant as Singapore, but even that country has launched an energy-saving program to see how performance can be maintained without cranking up the AC. The Tropical Data Centre (TDC) program is aiming for a 40 percent reduction in energy use by allowing temperatures to exceed the conventional 77 degrees F threshold and pushing relative ambient humidity above 60 percent. The program is backed by HPE, Dell, Huawei and others, with a proof-of-concept facility planned for later this summer.
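In practice, running warmer means monitoring against two envelopes rather than one: the conventional setpoints and the relaxed ones under test. A minimal sketch of such a check follows; the 77 degrees F and 60 percent figures come from the article, while the relaxed margins, sensor readings and labels are invented, since the TDC program’s actual limits and alerting policy are not described:

```python
# Sketch of an environmental check against conventional vs relaxed setpoints.
# 77 F and 60% RH come from the article; the relaxed margins below (an extra
# 9 F and 20 points of RH) are assumptions for illustration only.

CONVENTIONAL_MAX_F = 77.0
CONVENTIONAL_MAX_RH = 60.0

def classify_reading(temp_f, rh_pct, margin_f=9.0, margin_rh=20.0):
    """Label a sensor reading relative to conventional and relaxed envelopes."""
    if temp_f <= CONVENTIONAL_MAX_F and rh_pct <= CONVENTIONAL_MAX_RH:
        return "within conventional limits"
    if temp_f <= CONVENTIONAL_MAX_F + margin_f and rh_pct <= CONVENTIONAL_MAX_RH + margin_rh:
        return "within relaxed limits"
    return "out of range"

print(classify_reading(75.0, 55.0))  # comfortably inside the conventional envelope
print(classify_reading(82.0, 70.0))  # acceptable only under the relaxed envelope
print(classify_reading(90.0, 85.0))  # beyond both envelopes
```

The point of a program like TDC is to find out how far that second envelope can safely stretch before reliability suffers.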
As noted above, however, none of these “low-hanging fruit” options should dissuade the enterprise from making bigger changes to data infrastructure. Advances in automation, analytics and architecture will go a long way toward greening up the data center, but only if they are incorporated as core elements in the systems refresh cycle. In the meantime, IT should do all it can to keep the power draw to a minimum.
Energy efficiency should not come at the expense of reliability, availability and other factors, of course, but by now both data and power infrastructure have advanced to the point that this is no longer an either/or decision.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata and Carpathia. Follow Art on Twitter @acole602.