Could New Cooling Method Make Multi-Core Processors Irrelevant?

Loraine Lawson

Here's a nifty news development: A North Carolina company says it has developed a way to use nanotechnology to cool computer chips.


According to this AP article from Forbes, the company applies a thin film of bismuth telluride to the copper pillar bumps used to attach chips to their packages. The thermoelectric film pulls heat away from the chip's hot spots, so the bumps themselves double as tiny coolers.


The company, Nextreme, claims this method reduces temperatures by as much as 60 degrees Celsius, though the article adds that hot spots typically need only 5 to 15 degrees of cooling. Nextreme says its approach, which is compatible with current manufacturing methods, could make multi-core processors unnecessary.
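The article doesn't get into the physics, but this kind of cooling relies on the Peltier effect: driving current through a thermoelectric material such as bismuth telluride pumps heat from its cold side to its hot side. As a rough, illustrative sketch (not from the article; the Seebeck coefficient, resistance, thermal conductance and temperatures below are assumed, textbook-style values for a single thin-film element, not Nextreme's specs), here's the standard single-element cooling estimate:

```python
# Back-of-the-envelope Peltier cooling for one thermoelectric "bump".
# All numbers are illustrative assumptions, not Nextreme specifications.

S = 200e-6   # Seebeck coefficient of the element (V/K), typical for Bi2Te3
R = 0.01     # electrical resistance of the element (ohms), assumed
K = 0.005    # thermal conductance of the element (W/K), assumed
T_c = 350.0  # cold-side (chip hot-spot) temperature (K), roughly 77 C
dT = 10.0    # temperature difference pumped across the film (K)

def net_cooling(current):
    """Heat removed from the cold side at a given drive current.

    Peltier pumping (S * T_c * I) minus half the Joule heating (I^2 * R / 2)
    minus heat leaking back through the element (K * dT).
    """
    return S * T_c * current - 0.5 * current**2 * R - K * dT

# Current that maximizes cooling for a single element: I_opt = S * T_c / R
I_opt = S * T_c / R
print(f"Optimal drive current: {I_opt:.1f} A")
print(f"Net heat pumped at that current: {net_cooling(I_opt) * 1000:.0f} mW")
```

The takeaway is only that modest, localized heat pumping from a thin-film element is physically plausible; the real numbers depend on geometry and materials the article doesn't publish.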


None of the chip makers would comment on Nextreme's announcement, but the article notes that Nextreme's chief technology officer is a former Intel senior scientist.


Nextreme also unveiled a product based on the new cooling method - which it calls "thin-film thermal bump technology" - a thermoelectric module named the Ultra-High Packing Fraction (UPF) OptoCooler. I have no idea what that means, but apparently it's aimed at laser diode, LED and advanced sensor products. You can read the press release here.

Comments
Jan 1, 2011 1:02 AM Brett Clavier says:

I'm not sure how this makes multi-core processors "irrelevant." I can see it making them more efficient and able to run at higher clock speeds, but irrelevant? I think they are being a bit ambitious.

