    New Approaches to IT Efficiency

    Virtually everyone is in favor of an energy-efficient data center. But if that is the case, why has the industry struggled so mightily to reduce power consumption?

    Even with the remarkable gains from virtualization and other advanced architectures, the data center remains one of the biggest energy consumers on the planet and, worse, a top cost center for the business.

    But the options for driving greater efficiency in the data center are multiplying by the day – from low-power, scale-out hardware to advanced infrastructure and facilities management software to new forms of power generation and storage. As well, there is the option to offload infrastructure completely to the cloud and refocus IT around service and application delivery, in which case things like power consumption and efficiency become someone else’s problem.

    Somewhere along the data chain, however, electrons have to encounter physical resources, and improving the efficiency of that interaction will be a key function of emerging data center designs. The growing field of Data Center Infrastructure Management (DCIM) is putting a wealth of tools at the enterprise’s disposal, such as the new Cooling: Optimize module in Schneider Electric’s StruxureWare platform. Based on technology from cooling specialist Vigilent, the system uses a network of sensors and machine-learning control software to monitor temperature conditions and then adjust AC equipment as needed. Integrating the system into the broader StruxureWare platform is intended to lower upfront costs and produce better long-term efficiencies for an improved TCO over the data center lifecycle.
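    To illustrate the general idea behind this kind of closed-loop cooling control, here is a minimal Python sketch. It is not Schneider Electric’s or Vigilent’s actual API; every name and value in it is hypothetical, and the naive proportional rule stands in for the machine-learning control a real product would apply.

    ```python
    # Hypothetical sketch of a sensor-driven cooling control loop.
    # None of these names reflect a real DCIM API.
    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        zone: str       # e.g., a cold aisle or rack inlet
        temp_c: float   # measured inlet temperature

    TARGET_C = 24.0     # desired inlet temperature (illustrative)
    DEADBAND_C = 1.0    # tolerance before adjusting a unit

    def adjust_cooling(readings, fan_speeds):
        """Nudge per-zone fan speed toward the target temperature.

        A real system would use a learned model of how each AC unit
        affects each zone; this uses a naive proportional rule.
        """
        for r in readings:
            error = r.temp_c - TARGET_C
            if abs(error) > DEADBAND_C:
                # Raise fan speed when too hot, lower it when
                # overcooled, clamped to a 0-100% operating range.
                current = fan_speeds.get(r.zone, 50.0)
                fan_speeds[r.zone] = min(100.0, max(0.0, current + 5.0 * error))
        return fan_speeds

    readings = [SensorReading("aisle-1", 26.5), SensorReading("aisle-2", 22.8)]
    print(adjust_cooling(readings, {"aisle-1": 50.0, "aisle-2": 50.0}))
    ```

    Note that the overcooled aisle gets its fan speed turned down, not just the hot one turned up; trimming excess cooling is where much of the energy savings in these systems comes from.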

    As the market matures, however, it is becoming clear that efficiency will not improve through technology alone. Rather, it requires a multi-pronged approach that combines tools, practices and the mindsets of key data center personnel. The Uptime Institute recently consolidated its various efficiency programs into an official Stamp of Approval that seeks to measure success by outcomes rather than initiatives. In this more holistic view, the group aims to address many of the underlying causes of data center inefficiency, namely those that fall under leadership, operations and design. Participating organizations receive either a two-year Approved stamp or a one-year Activated stamp, with benchmarks spanning planning, decision-making and action, as well as asset utilization and lifecycle management across data infrastructure. So far, the group has issued stamps to Kaiser Permanente and Mexico’s CEMEX.

    Data Efficiency

    As well, leading research organizations are crafting new data center models that treat operational efficiency as a function of overall performance. The U.S. National Science Foundation’s Center for Energy-Smart Electronic Systems (ES2) at New York’s Binghamton University recently examined data center modeling techniques and found that software approaches like computational fluid dynamics are a good starting point, but fine-tuning an existing facility is far more effective when empirical data from that specific environment is used. This requires in-depth airflow measurement across racks, aisles, ductwork and conduits, accounting even for minor alterations like floor jacks and cutouts. It also helps to return a room to its native state by shutting down equipment and normalizing air pressure, to get a better handle on how the working production environment affects environmental conditions.
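    To make the empirical approach concrete, here is a small Python sketch of what a walk-through survey analysis might look like: aggregating measured rack inlet and outlet temperatures and flagging racks whose delta-T runs well above the mean, a possible sign of recirculation or blocked airflow. The rack IDs, readings and threshold are invented for illustration and are not drawn from the ES2 study.

    ```python
    # Toy analysis of an empirical airflow/temperature survey.
    # All figures are invented for illustration.
    import statistics

    # (rack_id, inlet_c, outlet_c) measurements from a walk-through survey
    survey = [
        ("R01", 22.1, 34.0),
        ("R02", 25.8, 38.9),
        ("R03", 21.9, 29.5),
    ]

    # Per-rack temperature rise from inlet to outlet
    deltas = {rack: outlet - inlet for rack, inlet, outlet in survey}
    mean_dt = statistics.mean(deltas.values())

    for rack, dt in deltas.items():
        # Racks running well above the mean delta-T suggest airflow
        # problems worth investigating with the physical tweaks
        # (floor cutouts, jacks) mentioned above.
        flag = "CHECK" if dt > mean_dt * 1.2 else "ok"
        print(f"{rack}: dT={dt:.1f}C [{flag}]")
    ```

    The point is not the arithmetic but the workflow: measurements from the actual room, not a generic model, decide where the tuning effort goes.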

    As I’ve mentioned in the past, power efficiency is a never-ending struggle. Systems can never be too efficient or too green, and like house-cleaning, few people notice all the work that has been done, just the parts that are still dirty.

    And the sad fact is that efficiency gains will likely diminish over time, requiring more effort for less result. But as infrastructure scales up and out, even small gains could very well translate to big savings to the operational bottom line.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.

