There's a lot of chatter in IT circles these days about how much power the industry consumes. In fact, some projections suggest that IT could account for as much as 40 percent of all electricity generated by 2030.
Obviously, that isn't sustainable. Nor is it likely to happen, says Don Newell, CTO for server products at Advanced Micro Devices.
The IT industry has only recently started tackling energy efficiency as an engineering problem, says Newell. A number of advances are on the horizon, ranging from how data centers are powered and cooled to new memory technologies and processor architectures.
AMD, for example, is working on Bulldozer, a next-generation processor due in 2011 that will deliver up to 16 cores per processor.
In addition, advancements in systems-management software will increasingly focus on reducing power consumption.
Overall, says Newell, there will be much more emphasis in the future on calculating performance per dollar spent and per watt consumed in all aspects of IT.
Unfortunately, given where the industry stands at the moment, things could get out of hand. Until recently, the industry as a whole had largely ignored the power issue. But Newell says you can expect power consumption to be at the forefront of every major IT innovation from here on out.