Talk of the green data center has primarily centered around server virtualization as a means to consolidate hardware and draw less energy.
Problem is, with most data centers well on the way toward fully virtual environments, that well could run dry fairly quickly.
So the question naturally becomes: Where else can we find redundancies that are driving up the power bill? And the answer is, not surprisingly, storage.
The tools needed to cut power requirements in the storage farm have been around for some time, but only lately have they been seeing significant deployments. The goals of tools like data deduplication, thin provisioning and capacity management are twofold, according to Storage Switzerland's George Crump. They help you run storage more efficiently, so fewer drives are active to meet required performance levels, and they reduce the need to acquire and deploy storage, which has an impact on both capital and operating budgets. He says that some of the latest technologies, if deployed correctly and depending on the current state of your storage infrastructure, can cut storage power requirements in half.
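To see why deduplication frees up so many spindles, consider a minimal sketch of block-level dedupe using content hashing. The function names and fixed block size here are hypothetical, chosen for illustration; real products use far more sophisticated chunking and indexing:

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks; store each unique block only once."""
    store = {}    # content hash -> unique block contents
    recipe = []   # ordered hashes needed to reconstruct the original data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)
        recipe.append(digest)
    return store, recipe

def reconstruct(store, recipe):
    """Rebuild the original byte stream from the unique-block store."""
    return b"".join(store[h] for h in recipe)

# Highly redundant data -- 100 identical 4 KiB blocks -- collapses to one
# stored block plus a recipe, which is where the capacity savings come from.
data = b"x" * 4096 * 100
store, recipe = dedupe_blocks(data)
print(len(store))                           # 1 unique block stored
print(reconstruct(store, recipe) == data)   # True
```

Fewer stored blocks means fewer drives spinning to hold the same logical data, which is the direct link between dedupe and the power bill.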
The biggest problem in deploying current energy-saving solutions is that, like server virtualization, they tend to consolidate hardware into denser and denser configurations, according to Illuminata's John Webster and CA's Chris Stakutis. The result is invariably greater heat generation, which in turn requires more energy to keep systems from melting down. They recommend a crash course in "storage environmentals" -- that is, the means to measure and monitor overall power consumption, preferably in real time. This will require a series of sensor devices tied directly to monitoring software so you can drill down into actual consumption patterns and predict the consequences of additional consolidation or deployments. Fortunately, they say, much of this technology is readily available and easily implemented.
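The "predict the consequences" step can be as simple as projecting recent sensor readings forward against a power budget. A minimal sketch, assuming hypothetical per-rack kilowatt samples from PDU or sensor polling (the function names and figures are illustrative, not any vendor's API):

```python
from statistics import mean

def projected_draw(readings_kw, planned_additional_kw):
    """Estimate post-deployment draw: recent baseline plus planned new gear.

    readings_kw          -- recent whole-rack power samples in kW
    planned_additional_kw -- nameplate draw of equipment you plan to add
    """
    return mean(readings_kw) + planned_additional_kw

def within_budget(readings_kw, planned_additional_kw, budget_kw):
    """Will the rack stay under its power (and, by proxy, cooling) budget?"""
    return projected_draw(readings_kw, planned_additional_kw) <= budget_kw

samples = [42.1, 43.5, 41.8, 44.0]         # hypothetical per-rack kW samples
print(within_budget(samples, 6.0, 50.0))   # True: ~42.85 + 6.0 <= 50
```

Real monitoring suites add trending and cooling models on top, but the core check -- baseline plus planned load against a ceiling -- is the same.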
You'll also have to get familiar with many of the new measurement techniques and standards coming out of the SNIA, EPA and other groups, according to CIOL. Depending on the size of your infrastructure, you may or may not be getting the savings you had hoped based on GB/W, IOPS/W and other metrics. The EPA, in particular, is fond of PUE, power usage effectiveness, which it calculates as the total power draw across the data center divided by the total consumption of all IT-related equipment. A rating of 1 is ideal, while 2.5 is considered poor. The thinkers behind these numbers estimate that 1.6 is a good target.
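The PUE arithmetic is straightforward enough to sketch directly from that definition (the function name and sample figures are illustrative):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    1.0 is the ideal (every watt goes to IT gear); higher values mean more
    overhead lost to cooling, power distribution and the like.
    """
    return total_facility_kw / it_equipment_kw

# A facility drawing 800 kW overall to run 500 kW of IT equipment hits the
# 1.6 target; one drawing 1,250 kW for the same gear rates a poor 2.5.
print(pue(800, 500))    # 1.6
print(pue(1250, 500))   # 2.5
```

Note that PUE measures facility overhead, not storage efficiency itself -- consolidating storage lowers the IT denominator, so the same cooling plant can actually make PUE look worse even as total consumption falls.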
If you don't mind signing up for a subscription to GigaOM Pro, a good place to start researching the ins and outs of green storage is this report from Analytico President Tom Trainer. He covers the basics on technologies like thin provisioning, information lifecycle management and flash storage, and even goes into how all of this will be affected by the advent of cloud-based storage.
Green storage may or may not produce the kind of results we are seeing with server consolidation, but in most cases it will deliver a net gain in energy efficiency once the cost of deploying any or all of the new green technologies is recouped. The timing is also right, given that energy costs are low and demand for green systems is lower than it otherwise would be.
And naturally, the return only increases if the price of oil kicks up again.