
IT Still Struggling with Server Utilization


By Arthur Cole | Aug 29, 2014

It’s probably going to be a fact of life for the enterprise industry that the data center will remain a perennial target of energy conservation groups and environmental lobbyists.

There is some justification for this scrutiny, of course, given that data facilities are collectively among the largest consumers of electricity on the planet. But the fact remains that the world economy is now firmly dependent on digital data, and that those volumes, and therefore energy consumption, are only going to increase. That means energy efficiency will have to remain front and center if humanity hopes to maintain the benefits that digital infrastructure provides.

This is why the latest report from the Natural Resources Defense Council is so troubling. The group (which is certainly biased toward conservation, don’t get me wrong) estimates that upwards of 30 percent of data center servers in use today are completely idle but still drawing power, and that those that remain utilize on average just 12 to 18 percent of total capacity. All told, the data industry could still trim about 40 percent of its energy consumption, worth about $3.8 billion per year. That’s nearly 40 billion kilowatt-hours, or roughly the output of 12 coal-fired power plants.
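
Those figures do hang together, for what it’s worth. A quick back-of-the-envelope check in Python (the electricity rate is my assumption, a rough U.S. commercial average, not a number from the report):

```python
kwh_saved = 40e9            # kilowatt-hours per year, per the NRDC estimate
rate_usd_per_kwh = 0.095    # assumed: ballpark U.S. commercial rate, not from the report
savings = kwh_saved * rate_usd_per_kwh
print(f"${savings / 1e9:.1f} billion per year")   # -> $3.8 billion per year
```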

Some may quibble with the actual numbers, but low server utilization has long been an acknowledged problem among enterprise professionals. So what is the best way to confront it? Advanced monitoring and control are among the top solutions, says Greenbiz.com’s Heather Clancy. As more infrastructure is virtualized, the ability to rapidly shift loads between systems improves, but only if IT has visibility into the relationships between data patterns and energy consumption. Beyond that, increased public disclosure of energy performance and the use of green data service contracts through The Green Grid and other groups can be highly effective.
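
To make that concrete, here is a minimal sketch of the consolidation decision that monitoring data enables. The thresholds and the utilization snapshot are invented for illustration; a real shop would pull these readings from its monitoring stack:

```python
# Hypothetical snapshot from a monitoring system:
# host -> average CPU utilization over the last hour (fraction of capacity).
utilization = {
    "host-01": 0.62, "host-02": 0.04, "host-03": 0.15,
    "host-04": 0.00, "host-05": 0.11, "host-06": 0.48,
}

IDLE = 0.05          # below this, the host is effectively idle but drawing power
CONSOLIDATE = 0.20   # below this, its load is a candidate for migration elsewhere

idle = [h for h, u in utilization.items() if u <= IDLE]
candidates = [h for h, u in utilization.items() if IDLE < u <= CONSOLIDATE]

print("power down or repurpose now:", idle)
print("migrate VMs off, then power down:", candidates)
```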

Many of the leading energy initiatives are coming from the hyperscale community, even though it represents only a tiny fraction of the overall data industry. Facebook, for example, has introduced a new power-centric load balancing tool called Autoscale that shuttles data loads onto the smallest possible hardware footprint. Normally, loads are distributed among server nodes in a round-robin fashion, with each server handed an equal share of CPU demand. That works well during periods of heavy traffic but is woefully inefficient when the load drops. Autoscale instead concentrates loads on select machines until utilization reaches a certain point, usually about half, allowing the other devices to idle or be repurposed for low-level batch processing and other functions.
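
Here is a rough sketch of the difference between the two dispatch strategies. To be clear, this is not Facebook’s code; the 50 percent threshold and the toy server model are stand-ins based on the description above:

```python
import itertools

class Server:
    """Toy server model: 'capacity' is in arbitrary CPU units."""
    def __init__(self, name, capacity=100):
        self.name = name
        self.capacity = capacity
        self.load = 0

    def utilization(self):
        return self.load / self.capacity

def round_robin(servers):
    """Classic dispatch: hand requests to every server in turn,
    no matter how lightly loaded the fleet is."""
    return itertools.cycle(servers).__next__

def concentrate(servers, threshold=0.5):
    """Autoscale-style dispatch: fill the first server that is below
    the utilization threshold, leaving the rest of the fleet idle."""
    def pick():
        for s in servers:
            if s.utilization() < threshold:
                return s
        return min(servers, key=Server.utilization)  # fleet saturated: least loaded
    return pick

servers = [Server(f"web{i}") for i in range(8)]
dispatch = concentrate(servers)

for _ in range(120):          # simulate 120 requests of 1 CPU unit each
    dispatch().load += 1

print([(s.name, s.load) for s in servers if s.load])
# -> [('web0', 50), ('web1', 50), ('web2', 20)]; web3-web7 never wake up
```

Swapping in `round_robin(servers)` spreads those same 120 requests at 15 units apiece across all eight machines: none busy enough to run efficiently, none idle enough to power down.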

More than just new technology, however, improved utilization requires a new mindset among IT executives. The tendency to overprovision resources at the initial build is a problem that needs to be addressed, says tech blogger Mark Monroe. Set aside for the moment that overprovisioning drives up the capital budget; it also locks the enterprise into a static hardware model at a time when advances in performance and efficiency are coming at a rapid pace. By grabbing all it can now, IT ends up hurting itself in the long run, sacrificing performance and driving up costs through the inefficiencies inherent in aging infrastructure. A much better approach is to build for appropriate densities now but plan for future growth using low-cost modular systems or readily available cloud resources.
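
A toy calculation shows why. Every number below is invented, but the pattern holds: the big up-front build leaves far more hardware idling (and aging) than modular increments that track actual demand:

```python
# Illustrative only: all figures are invented to show the pattern.
demand = [300, 450, 650]      # servers actually needed in years 1-3

# Strategy 1: build to the year-3 projection on day one.
upfront = 1000
print("idle, big build:   ", [upfront - d for d in demand])   # [700, 550, 350]

# Strategy 2: add modular blocks only as demand materializes.
block, bought, idle = 250, 0, []
for d in demand:
    while bought < d:
        bought += block       # each later block also arrives a generation newer
    idle.append(bought - d)
print("idle, modular build:", idle)                           # [200, 50, 100]
```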

It should be clear by now that data center efficiency is a journey, not a destination. And even if efficiency does improve dramatically over the next few years, the relentless increase in data loads all but ensures that overall consumption will continue to rise.

When that comes to pass, perhaps it will be time to shift the conversation away from how data is provided and toward who is using it, and why.

Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
