Here's a nifty news development: A North Carolina company says it has developed a way to use nanotechnology to cool computer chips.
According to this AP article from Forbes, the company applies a thin film of bismuth telluride to the copper pillar bumps used to attach chips to their packages. The bismuth telluride film pulls heat away from the chip's hot spots, turning the bumps themselves into active cooling elements.
Nextreme claims this method reduces temperatures by as much as 60 degrees Celsius, though the article adds that hot spots typically need only 5 to 15 degrees of cooling. Nextreme says its approach, which is compatible with current manufacturing methods, could make multi-core processors unnecessary.
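Thermoelectric coolers like this work on the Peltier effect: driving current through the junction pumps heat from the cold side to the hot side, minus losses to Joule heating and heat conducting back. Here's a rough sketch of the standard heat-balance formula — every number below is an illustrative textbook-style value for bismuth telluride, not anything from Nextreme's announcement:

```python
# Back-of-envelope Peltier cooling balance for a single thermoelectric
# element. All values are illustrative, NOT Nextreme's specifications.

def peltier_heat_pumped(seebeck, current, t_cold, resistance, conductance, delta_t):
    """Net heat pumped from the cold side, in watts:
    Peltier term minus half the Joule heating minus back-conduction."""
    return (seebeck * current * t_cold
            - 0.5 * current**2 * resistance
            - conductance * delta_t)

S = 200e-6    # Seebeck coefficient, V/K (ballpark for bismuth telluride)
I = 1.0       # drive current, A
Tc = 350.0    # cold-side (chip hot spot) temperature, K
R = 0.01      # electrical resistance of the element, ohms
K = 0.002     # thermal conductance of the element, W/K
dT = 10.0     # temperature difference across the film, K

q = peltier_heat_pumped(S, I, Tc, R, K, dT)
print(f"Net heat pumped: {q * 1000:.1f} mW")  # → Net heat pumped: 45.0 mW
```

The interesting trade-off the formula shows: pushing more current increases the Peltier term linearly but the Joule-heating penalty quadratically, so there's an optimum drive current beyond which the cooler actually heats the chip.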
None of the chip makers would comment on Nextreme's announcement, but the article notes that Nextreme's chief technology officer is a former Intel senior scientist.
Nextreme also revealed a product that uses the new cooling method - which it refers to as "thin-film thermal bump technology" - in a new thermoelectric module called the Ultra-High Packing Fraction (UPF) OptoCooler module. I have no idea what that means, but apparently it's used for laser diode, LED and advanced sensor products. You can read the press release here.