Heat is a form of energy, and energy is a commodity. And commodities, of course, can be sold for a profit.
So it is something of a mischaracterization to say that data centers are constantly dealing with the problem of waste heat, when what is really going on is that they are failing to capitalize on their heat-generating capabilities.
But a few are starting to realize the commercial possibilities of the heat coming off the server racks. Probably the most innovative is the Foundry Project in Cleveland, Ohio, which is pumping heat from an underground data center to a $4.5 million co-located fish farm devoted to raising Mediterranean sea bass. The data center itself will measure about 40,000 square feet and is linked by three 100Gbps fiber networks. Foundry executives say they already have a client lined up but have yet to reveal its name. Meanwhile, the fish farm is expected to produce about 500,000 pounds of fish per year, and waste from the fish will be delivered to a nearby orchard as fertilizer.
Other organizations, particularly those in the hyperscale set, are turning their excess heat on themselves to regulate the temperatures of their own corporate campuses. Amazon, for example, will use warm air from the Westin Building in Seattle to heat water that will then be pumped to a series of nearby biodomes to help regulate temperatures there. Once cooled, the water will be returned to the data center as a coolant, and the cycle repeats.
And organizations in the market for a new supercomputer may be able to defray the cost through lower energy bills with HP’s newest Apollo 8000 model. The system features more than 30,000 Intel Xeons capable of hitting 1.19 petaflops. Its most recent deployment at the National Renewable Energy Laboratory has helped reduce heating costs by 74 percent, producing annual savings of more than $1 million.
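A quick back-of-envelope check on those figures (a sketch based only on the numbers reported above, not HP’s or NREL’s accounting): a 74 percent reduction that yields roughly $1 million in annual savings implies a pre-deployment heating bill of about $1.35 million per year.

```python
# Back-of-envelope: infer the implied baseline heating cost
# from the reported 74% reduction and ~$1M in annual savings.
reduction = 0.74            # fraction of heating costs eliminated (reported)
annual_savings = 1_000_000  # annual savings in USD (approximate, reported)

baseline_cost = annual_savings / reduction       # implied cost before deployment
remaining_cost = baseline_cost - annual_savings  # implied cost after deployment

print(f"Implied baseline heating cost:  ${baseline_cost:,.0f}/year")
print(f"Implied remaining heating cost: ${remaining_cost:,.0f}/year")
```

The point of the exercise is simply that the savings come off a heating bill of only modestly larger size; the economics hinge on how much heat the facility would otherwise have to buy.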
All of these projects have one thing in common: The heat generated by the data center is used for on-site or nearby facilities. But what if it could be diverted to a wider area? In the case of a Dutch company called Nerdalize, the plan is not to distribute the heat, but the heat source. The company has developed a radiator that captures heat from a single server to be used for home heating. This produces a distributed server architecture, of course, with the attendant latency and integration challenges, but at about 500 euros apiece, the device could let homeowners anywhere provide data infrastructure for themselves and others while gaining a substantial heat source in the bargain.
And even the idea that hot air cannot be stored and shipped is starting to erode. Companies like Highview Power Storage are already providing commercial services based on liquid air energy storage (LAES), in which air is liquefied at cryogenic temperatures so it can be stored, shipped and then evaporated as needed to produce energy. The liquefaction process requires energy up front, of course, so it is likely to be economical only in highly scaled environments. But it is certainly better than simply pumping hot air outside, and it offers a greener source of energy for large coal- or oil-fired electricity turbines.
Either way, as infrastructure becomes denser, temperatures will go up. And most organizations will find that it is easier, and more profitable, to leverage their heat than to fight against it.
Arthur Cole writes about infrastructure for IT Business Edge. Cole has been covering the high-tech media and computing industries for more than 20 years, having served as editor of TV Technology, Video Technology News, Internet News and Multimedia Weekly. His contributions have appeared in Communications Today and Enterprise Networking Planet and as web content for numerous high-tech clients like TwinStrata, Carpathia and NetMagic.