Are GPUs the Future of IT?

Arthur Cole

It's unfortunate, but it shouldn't come as any surprise that Intel and NVIDIA had to take their dispute over integrated memory controllers to court. That's because the issue is much larger than who gets to use what technology for their chipsets.


The bigger question is whether the future of the processor industry belongs to the CPU or the graphics processing unit (GPU).


The dispute erupted over NVIDIA's stance that its earlier licensing agreements allow it to sell chipsets supporting the new Direct Media Interface (DMI) technology, which Intel uses on its new Core i7 (Nehalem) processors. That is a problem for Intel because many integrators are finding NVIDIA's implementation of the technology more powerful than Intel's own, even for applications that aren't graphics-heavy.


The two companies have been trying to work out their differences for more than a year, hoping all the while to avoid escalating the conflict to the point that it affects their numerous other business relationships. In the end, Intel moved to let the Delaware Court of Chancery decide the matter.


That decision could have a significant impact on Intel's future. The company has laid out a far-reaching CPU development program, one that envisions a steadily increasing core count per chip, intended to position the company well for the high-speed, high-capacity cloud environments of the not-too-distant future. If users come to feel that the throughput and acceleration capabilities of GPUs serve them better than CPUs do, and the court decides NVIDIA does have the right to support DMI in its chipsets, it would deal a major blow to Intel's position as the backbone of the IT industry.


But is the GPU necessarily better? Word is that, at the moment at least, it helps in some circumstances but not in others. A report on TG Daily says that Adobe users gain a lot from graphics cards offering up to 1 Tflop of performance, but that limited bandwidth in the surrounding hardware makes them impractical for general-purpose use for now. That could change in future generations, particularly if operating systems like OS X implement an improved hardware interface.
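To see why bus bandwidth, rather than raw horsepower, is the sticking point, consider a rough back-of-envelope sketch in Python. Every figure here is an illustrative assumption, not a measurement: a ~1 Tflop card as in the report, a guessed ~4 GB/s of effective host-to-card bandwidth, and a workload doing 10 floating-point operations per element.

# Back-of-envelope: when does bus bandwidth erase a GPU's FLOPS advantage?
# All figures below are illustrative assumptions, not measured numbers.

GPU_FLOPS = 1e12        # ~1 Tflop card, per the report cited above
BUS_BYTES_PER_SEC = 4e9 # assumed ~4 GB/s effective host-to-card bandwidth

def gpu_job_time(n_floats, flops_per_float):
    """Time to ship n_floats to the card, process them, and ship results back."""
    transfer = 2 * n_floats * 4 / BUS_BYTES_PER_SEC  # 4 bytes each, both directions
    compute = n_floats * flops_per_float / GPU_FLOPS
    return transfer, compute

# A low-arithmetic-intensity task: 10 flops per element on 100 million floats.
transfer, compute = gpu_job_time(100_000_000, flops_per_float=10)
print(f"transfer: {transfer:.3f} s, compute: {compute:.4f} s")

Under these assumptions the transfer takes roughly 200 times longer than the computation itself, which is the sense in which "limited bandwidth in the surrounding hardware" can make even a very fast GPU impractical for everyday, general-purpose work.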


For the time being, then, it looks like the CPU will continue to rule the roost. But integrated graphics has always been one of the weak points of Intel silicon. If the company truly wants to maintain its position as the computer industry's general-purpose chip supplier, it might want to pay greater attention to graphics processing in its development program.


