Energy Efficiency: How Green Is Green?

Arthur Cole

Nothing shakes up the IT blogosphere like a good controversy. And we got one this week courtesy of The New York Times.

It seems that Times reporter James Glanz wanted to shed some light on the power utilization rates the industry has achieved so far, concluding that they stand at a paltry 7 to 12 percent — meaning that upwards of 90 percent of the 30 billion watts that data centers consume worldwide is wasted. To make matters worse, many of the top facilities maintain their own power-generating capabilities, usually fueled by distinctly non-eco-friendly diesel engines. As culprits, Glanz points to a combination of insatiable demand from users, a risk-averse culture in IT that values redundancy over efficiency, and a basic failure on the part of many enterprises to truly understand how their data infrastructure is behaving.

Cue the blowback. Dan Woods of CITO Research blasted the piece for its "many glaring misconceptions, omissions, and distortions," the most egregious of which is the failure to distinguish between traditional IT and the new Internet industry. Old-line infrastructure in hospitals, the financial industry and manufacturing may be lagging in the efficiency race, but the truly large data centers used by Google, Facebook and the like are as state-of-the-art as you can get. It's also unfair, he says, to hold IT to a higher efficiency standard than, say, newspapers, which require reams of paper for readers who may read only two or three articles per issue.

Meanwhile, Forrester's Richard Fichera had a few bones to pick with the Times' energy utilization numbers, which he described as technically correct but misleading nonetheless. By calculating only server energy consumption, the Times discounts the role that storage and networking play in data processing. Add those two elements and the utilization figure could jump to 38 percent or more — not a great number, to be sure, but a more accurate depiction of data center efficiency.
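To see how much the answer depends on what you count as "useful" work, here is a back-of-envelope sketch of the two views. Only the 30 billion watts, the 7 to 12 percent server figure, and the roughly 38 percent adjusted figure come from the article; the assumed share of power going to storage and networking is a hypothetical fill-in chosen to reproduce Fichera's number, not a measured value.

```python
# Back-of-envelope comparison of the Times' and Fichera's utilization math.
# The storage/networking share below is an illustrative assumption.

TOTAL_WATTS = 30e9  # worldwide data center draw cited by the Times

# Times view: only server cycles count as useful work, at ~7-12% utilization.
server_utilization = 0.12  # top of the Times' 7-12 percent range
useful_servers_only = TOTAL_WATTS * server_utilization
wasted_fraction = 1 - server_utilization
print(f"Servers only: {useful_servers_only / 1e9:.1f} GW useful, "
      f"{wasted_fraction:.0%} 'wasted'")

# Fichera-style view: storage and networking gear also performs useful work.
# Assume (hypothetically) it accounts for another 26 points of the total draw.
storage_network_share = 0.26
useful_fraction = server_utilization + storage_network_share
print(f"With storage/networking: {useful_fraction:.0%} utilization")
```

Run with these assumptions, the first line lands near the Times' "upwards of 90 percent wasted" claim and the second near Fichera's 38 percent — which is the whole point of his objection: the same watts, partitioned differently, tell very different stories.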

At its heart, this debate revolves around the concept of "greenness." In short, what level of efficiency is appropriate for an industry that has emerged as the lifeblood of modern life as we know it? There are those who would argue that success will never arrive because there is always something that can be done to reduce consumption without sacrificing performance. True, but at what cost? And are the developments we've seen to date truly having an impact on the cost/benefit analysis?

A case in point is virtualization. Hailed as the savior of data environments because it can help one server do the job of 10, virtualization nonetheless cannot produce efficiency on its own. As we all know by now, too much virtualization in the server farm quickly overloads storage and networking capabilities, which must be virtualized themselves or built out on the hardware level (read: consume more energy) to clear the bottlenecks that ultimately hamper data performance. It's for these and other reasons that data centers continue to struggle with energy efficiency despite virtual server deployments of 30 percent or more.


It also should be noted that it's taken decades to create the data center universe as we know it, and it can't be changed on a dime just because economic and cultural shifts suddenly call for greater efficiency. There are those who will continue to say the industry is doing too little to cut the power demand, while others argue that expectations are too high.

Like it or not, though, demand for instant data anytime, anywhere is here to stay. And that means the drive to design, build and maintain energy-efficient infrastructure will only increase.

Oct 2, 2012 8:01 PM MarksThougts says:
What I see are a great many who are simply asking the wrong questions. It is not about efficiency directly or whether one kind of data center is better than another. The real question is: how does this achieve the progress we require as a society? Taken another way, at the turn of the 19th to the 20th century a 60-watt light bulb cost well over $25 to purchase and about $1,400 to run for the equivalent of a year — in our dollars. The demand for the combined products of light and electrical power was apparent; manufacturers and others could run night shifts — and with that demand the total costs for the product reached what we see today: pennies for the bulb and pennies to run it. We, as a nation, should not be looking at how to reduce the demand for electrical power or the products that consume it — we should be looking for ways to provide many times the demand we can afford today without killing the planet we live upon. The increase in demand feeds the competition of suppliers who, in turn, look to meet the demand. Look around you. Not long ago it was very, very expensive to own and use a mobile phone — we all have one now. Conservationists are mostly nuts.
