Anyone who has looked at the numbers realizes that the data center is between a rock and a hard place. Loads are going up, perhaps as much as 30 percent per year, while at the same time the cost of energy is on the rise now that the global economy seems to be on the mend.
For the IT industry, that means more resources going toward building and maintaining infrastructure, either at home or in the cloud, with no guarantee that any of it will produce real value. Once Big Data analytics comes online, it may turn out that the vast majority of this data is, well, junk.
True value, of course, is in the eye of the beholder – just ask any modern art dealer – but at least we have ways to determine if the energy budget is going toward productive uses. A task force consisting of representatives from The Green Grid, the U.S. Department of Energy and various European and Asian government agencies recently unveiled their latest tool, the Data Center Energy Productivity (DCeP) metric, which promises to delve deeper into energy usage in data environments and whether it is being put to good use.
The metric is actually built on a number of earlier schemes, including the Power Usage Effectiveness (PUE) and Carbon Usage Effectiveness (CUE) metrics, which have come under criticism over the years as being overly broad and subject to manipulation. Power usage, after all, can be dropped to zero if you shut everything down, but then the facility won’t provide any value. The DCeP aims to combine usage patterns with detailed knowledge of the workload itself in order to give data center managers a peek into what their energy consumption is actually accomplishing and whether steps can be taken to improve productivity.
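To make the contrast concrete, here is a minimal sketch of how the two styles of metric differ. PUE is defined by The Green Grid as total facility energy divided by IT equipment energy, while DCeP divides a user-defined quantity of "useful work" by total energy consumed. The energy and transaction figures below are invented for illustration only.

```python
# Hypothetical comparison of a facility-level metric (PUE) with a
# productivity metric (DCeP). All numbers are illustrative.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: lower is better, 1.0 is the ideal."""
    return total_facility_kwh / it_equipment_kwh

def dcep(useful_work_units: float, total_energy_kwh: float) -> float:
    """Data Center energy Productivity: useful work per unit of energy.
    What counts as a unit of useful work is defined by the operator."""
    return useful_work_units / total_energy_kwh

# One illustrative month of operation
facility_kwh = 1_200_000   # everything: IT gear, cooling, lighting
it_kwh = 750_000           # servers, storage, and network only
transactions = 90_000_000  # the operator's chosen measure of useful work

print(f"PUE:  {pue(facility_kwh, it_kwh):.2f}")                 # 1.60
print(f"DCeP: {dcep(transactions, facility_kwh):.1f} txn/kWh")  # 75.0
```

Note that shutting servers down improves the energy numbers feeding PUE while driving DCeP toward zero, which is exactly the manipulation problem the newer metric is meant to address.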
For those hoping for an industry-wide standard that can compare one facility against another, however, the DCeP won't suffice. Its designers readily admit that the "productivity" portion of the formula will be largely defined by the user, since only the operators themselves can determine what is and is not productive. That means the final measurement will vary widely across the industry, influenced not only by the amount of energy being consumed and the extent of virtualization and other low-energy architectures, but also by the applications and workloads themselves. Infrastructure dedicated to ecommerce or other Web-facing activities, for instance, will probably show markedly different results from back office or database systems, even though both can be deemed efficient.
The question of how deep the enterprise needs to delve into the data load to view efficiency factors has also arisen. TSO Logic CEO Aaron Rallo says it will first require a detailed analysis of how data and applications relate to core business goals, followed by mechanisms that can track things like the power cost of individual transactions, the number of transactions per kilowatt-hour and revenue and utilization rates of individual servers. As daunting as this sounds, the IT management industry is clearly headed in this direction, with the idea that not only can this kind of granular information be gathered and analyzed, but also that automated systems will then be able to coordinate data and energy flows on the fly, producing an optimized energy/production environment.
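The kind of per-server tracking Rallo describes can be sketched in a few lines. The server names, power draws, transaction counts, and revenue figures below are hypothetical, as is the $0.12/kWh electricity rate; the point is only to show how transactions per kilowatt-hour and energy cost per transaction fall out of a handful of measurements.

```python
# Hypothetical per-server efficiency tracking: transactions per kWh and
# energy cost per transaction. All sample values are invented.

from dataclasses import dataclass

@dataclass
class ServerSample:
    name: str
    avg_power_watts: float   # average draw over the sample window
    transactions: int        # transactions served in the window
    revenue_usd: float       # revenue attributed to those transactions
    window_hours: float = 24.0

    @property
    def energy_kwh(self) -> float:
        return self.avg_power_watts * self.window_hours / 1000.0

    @property
    def transactions_per_kwh(self) -> float:
        return self.transactions / self.energy_kwh

    def energy_cost_per_transaction(self, usd_per_kwh: float) -> float:
        return self.energy_kwh * usd_per_kwh / self.transactions

fleet = [
    ServerSample("web-01", 350.0, 2_400_000, 1_800.0),
    ServerSample("db-01", 500.0, 150_000, 2_200.0),
]

for s in fleet:
    print(f"{s.name}: {s.transactions_per_kwh:,.0f} txn/kWh, "
          f"${s.energy_cost_per_transaction(0.12):.7f}/txn")
```

A Web-facing server and a database server will score very differently on transactions per kilowatt-hour, which is why these figures only make sense relative to the business goal each machine serves, not as a single league table.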
Dreams of digital nirvana have been floating around the IT industry for years, but it seems that new technologies designed to solve existing problems inevitably lead to still more difficulties to be met by future generations.
Optimizing energy usage and data productivity is clearly a worthy goal, but it is not the kind of endeavor that lends itself to universal, uniform standards.