    IT Still Struggling with Server Utilization

It is probably a fact of life for the enterprise industry that the data center will remain a perennial target of energy conservation groups and environmental lobbyists.

There is some justification for this scrutiny, of course, given that data facilities are among the largest consumers of electricity on the planet. But the fact remains that the world economy is now firmly dependent on digital data, and that data volumes, and therefore energy consumption, are only going to increase. That means energy efficiency will have to remain front and center if humanity hopes to maintain the benefits that digital infrastructure provides.

This is why the latest report from the Natural Resources Defense Council is so troubling. The group, which is certainly biased toward conservation, don't get me wrong, estimates that upwards of 30 percent of data center servers in use today are completely idle but still drawing power, and that those still in service utilize, on average, just 12 to 18 percent of total capacity. All told, the data industry can still trim about 40 percent of its energy consumption, representing about $3.8 billion per year. That's nearly 40 billion kilowatt-hours, or roughly the output of 12 coal-fired power plants.
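As a quick sanity check on those figures, here is a back-of-the-envelope sketch. The electricity rate and the size of the reference coal plant are my own assumptions, not numbers from the report:

```python
# Back-of-the-envelope check of the reported savings figures.
# Assumed inputs (not from the report): ~9.5 cents/kWh average rate,
# a typical coal plant at ~500 MW running at a ~75% capacity factor.
potential_savings_kwh = 40e9          # ~40 billion kilowatt-hours per year

rate_per_kwh = 0.095                  # assumed average rate, $/kWh
annual_savings = potential_savings_kwh * rate_per_kwh
print(f"Annual savings: ${annual_savings / 1e9:.1f} billion")   # ~$3.8 billion

plant_mw = 500                        # assumed capacity of one coal plant
capacity_factor = 0.75                # assumed fraction of the year at full output
plant_kwh_per_year = plant_mw * 1000 * 8760 * capacity_factor
print(f"Equivalent coal plants: {potential_savings_kwh / plant_kwh_per_year:.1f}")  # ~12
```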

Some may quibble with the actual numbers, but low server utilization rates have long been an acknowledged problem among enterprise professionals. So what is the best way to confront the issue? Advanced monitoring and control are among the top solutions, says Greenbiz.com's Heather Clancy. As more infrastructure is virtualized, the ability to rapidly shift loads between systems improves, but this can only happen if IT has better visibility into the relationships between data patterns and energy consumption. Increased public disclosure of energy performance and the use of green data service contracts through the Green Grid and other groups can also be highly effective.
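To make that visibility requirement concrete, here is a minimal sketch of the kind of cross-referencing involved: pairing per-server utilization with power draw to flag machines that are idle but still burning electricity. The data structure, server names and thresholds are illustrative assumptions, not taken from any particular monitoring product.

```python
# Minimal sketch: flag servers that draw significant power while doing little work.
# The thresholds and sample data below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ServerSample:
    name: str
    avg_cpu_util: float     # 0.0 - 1.0, averaged over the sampling window
    avg_power_watts: float

IDLE_CPU_THRESHOLD = 0.05   # below 5% utilization, treat the server as idle
MIN_POWER_WATTS = 100       # yet still drawing at least this much power

def find_comatose_servers(samples: list[ServerSample]) -> list[ServerSample]:
    """Return servers that are effectively idle but still consuming power."""
    return [s for s in samples
            if s.avg_cpu_util < IDLE_CPU_THRESHOLD
            and s.avg_power_watts >= MIN_POWER_WATTS]

if __name__ == "__main__":
    fleet = [
        ServerSample("web-01", 0.62, 310.0),
        ServerSample("batch-07", 0.02, 180.0),   # idle but powered: consolidation candidate
        ServerSample("db-03", 0.15, 260.0),
    ]
    for server in find_comatose_servers(fleet):
        print(f"{server.name}: {server.avg_cpu_util:.0%} CPU, {server.avg_power_watts:.0f} W")
```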

    Many of the leading energy initiatives are coming from the hyperscale community, even though this represents only a tiny fraction of the overall data industry. Facebook, for example, has introduced a new power-centric load balancing tool called Autoscale that helps shuttle data loads to the smallest possible hardware footprint. Normally, loads are distributed among server nodes using a round-robin approach in which each server is tasked with equal CPU demand. This might work for periods of heavy traffic, but is woefully inefficient when the load drops. Autoscale concentrates loads on select machines until utilization reaches a certain point, usually about half, allowing other devices to idle or be repurposed for low-level batch processing or other functions.
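The difference between the two dispatch strategies is easy to sketch in a few lines. The code below is not Facebook's Autoscale, just an illustration of the general idea under assumed names and numbers: round-robin spreads requests evenly across every node, while a concentration policy fills one server toward a utilization target before waking the next.

```python
# Illustrative sketch of two dispatch policies; not Facebook's Autoscale code.
import itertools

SERVERS = ["srv-a", "srv-b", "srv-c", "srv-d"]
TARGET_UTIL = 0.5          # concentrate load until a server nears ~50% utilization

def round_robin(requests):
    """Spread requests evenly across all servers, keeping every machine busy."""
    cycle = itertools.cycle(SERVERS)
    return {req: next(cycle) for req in requests}

def concentrate(requests, cost_per_request=0.05, util=None):
    """Fill one server toward the utilization target before using the next."""
    util = util or {s: 0.0 for s in SERVERS}
    assignment = {}
    for req in requests:
        # Prefer the busiest server that still has headroom below the target;
        # fall back to the least-loaded server if every node is at the target.
        candidates = [s for s in SERVERS if util[s] + cost_per_request <= TARGET_UTIL]
        target = (max(candidates, key=lambda s: util[s]) if candidates
                  else min(SERVERS, key=lambda s: util[s]))
        util[target] += cost_per_request
        assignment[req] = target
    return assignment

requests = [f"req-{i}" for i in range(12)]
print(round_robin(requests))   # every server receives traffic
print(concentrate(requests))   # traffic lands on one or two servers; the rest can idle
```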

    More than just new technology, however, improved utilization requires a new mindset among IT executives. The tendency to overprovision resources at the initial build is a problem that needs to be addressed, says tech blogger Mark Monroe. Forgetting for the moment that overprovisioning drives up the capital budget, it also locks the enterprise into a static hardware model at a time when advances in performance and efficiency are moving at a rapid pace. So by grabbing all it can now, IT ends up hurting itself in the long run by diminishing performance and driving up costs due to the inefficiencies inherent in aging infrastructure. A much better approach is to build for appropriate densities now, but plan for future growth using low-cost modular systems or readily available cloud resources.
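A rough sketch of why overprovisioning stings: if the facility is built on day one for the demand projected five years out, a large share of that capacity sits idle, yet still drawing power and depreciating, in the early years. The growth rate and capacity figures below are hypothetical.

```python
# Hypothetical illustration of stranded capacity under day-one overprovisioning.
# All numbers are assumptions for illustration only.
YEARS = 5
INITIAL_DEMAND_KW = 400          # IT load actually needed in year 1
ANNUAL_GROWTH = 0.20             # projected 20% load growth per year

# Capacity built on day one to cover the demand projected for year 5.
peak_kw = INITIAL_DEMAND_KW * (1 + ANNUAL_GROWTH) ** (YEARS - 1)

for year in range(1, YEARS + 1):
    demand = INITIAL_DEMAND_KW * (1 + ANNUAL_GROWTH) ** (year - 1)
    idle_share = 1 - demand / peak_kw
    print(f"Year {year}: demand {demand:6.0f} kW, "
          f"{idle_share:.0%} of the built-out capacity sits idle")
```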

    It should be clear by now that data center efficiency is a journey, not a destination. And even if efficiency does improve dramatically over the next few years, the relentless increase in data loads all but ensures that overall consumption will continue to rise.

When that comes to pass, perhaps it will be time to shift the conversation away from how data is provided and toward who is using it, and why.

    Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.
