Can Competitors Unite as GPU Computing Ushers in a New Age?

Rob Enderle

Last week Nvidia held its first large developer conference for GPU computing. Around 1,500 people attended the event at the San Jose Fairmont hotel, the majority connected in some way to a GPU development effort. The excitement reminded me of the early days of computing, when people were thinking about innovation and doing impossible things. In fact, the phrase "doing the impossible" probably should have been the event's tagline, given how often it was used.

 

However, a large part of what got CPU computing off the ground initially was broad industry collaboration. In the early days, Intel, AMD and IBM were on the same page in this regard. They were increasingly competitors, but that competition initially didn't get in the way of market growth. A lot rides on GPU computing succeeding, but I wonder if the key players, AMD and Nvidia, understand that to take this mainstream, they need to collaborate and partner until the market is strong enough to stand on its own.

 

Let's talk about the promise and the problems of GPU computing.

 

GPU Computing: A New Age of Excitement and Wonder

Unlike CPU computing, which really had its birth in the workplace and was initially focused largely on spreadsheets, word processors and databases, GPU computing had its birth in gaming and has jumped to special effects, large-scale modeling and massive data analysis. The former was exciting for its time, primarily because the tools the PC replaced weren't particularly easy to use and tended to be very limited and not intelligent at all. GPU computing, however, is much more visually exciting and potentially will have a greater near-term impact on the quality of our lives.

 


For instance, Lucasfilm showcased how it is changing the way it makes movies, making the effects more realistic and dramatically improving the speed at which it gets things done. Much of the work is being done in medical research, which your life might one day depend on. Simon Hayhurst from Adobe spoke of one engineer who was convinced that editing in real time was impossible and, as was often the case in stories told at the show, was dumbfounded when he discovered he could now do it. The impact wasn't what you'd expect. Yes, it did speed things up, but the existing batch process was so time-intensive and expensive that artists and engineers had become afraid of trying new ideas because failures took too long to correct. They were discovering that they could be creative again, which suggests that the movie industry is about to see some massive improvements. (Granted, it will need to improve its storytelling as well, but that's not a technology problem at the moment.)

 

The number of applications being developed to do medical research and supercomputer-like scientific analysis was a bit overwhelming (a few examples here) and far beyond what I could understand. The talk on shaving monkey brains for detailed analysis was fascinating, for instance, though I'm not sure they really needed to share the pictures, which will be with me for a while.

 

An example of saving lives came from Sean Varah of MotionDSP, whose software takes a number of video frames and analyzes them in real time to extract a massive amount of additional detail. This is used to catch criminals (by clearing up blurry or distant images of faces and license plates) and to help the military more accurately identify threats. In the future, this same information will be used to create 3D images of events, eventually blurring the lines between reality and virtual reality. Each presenter, showcasing an excitement we haven't seen in the market for a while, testified that GPU computing allowed him or her to do things that couldn't be done before. They weren't abandoning the CPU -- they indicated it was still critical to the process -- it's just that the big advances are coming from the other side, largely, I think, because the GPU simply hasn't been widely used like this in the past. But I wonder if this can truly scale to a new age of computing.
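To make the multi-frame idea concrete, here is a minimal sketch of the simplest form of the technique: averaging already-aligned frames so random sensor noise cancels out while real detail (a face, a license plate) is reinforced. This is purely illustrative and not MotionDSP's actual method, which the article doesn't describe and which would involve far more sophisticated motion estimation, sub-pixel alignment and super-resolution work on the GPU.

/* Illustrative sketch only -- not MotionDSP's algorithm.
   Averages aligned grayscale frames so random noise cancels
   while the underlying detail is reinforced. */
#include <stddef.h>

/* Average num_frames frames of width*height pixels into out.
   Assumes the frames are already aligned to one another. */
void average_frames(const unsigned char **frames, size_t num_frames,
                    size_t width, size_t height, unsigned char *out)
{
    for (size_t i = 0; i < width * height; ++i) {
        unsigned long sum = 0;
        for (size_t f = 0; f < num_frames; ++f)
            sum += frames[f][i];         /* noise averages toward zero */
        out[i] = (unsigned char)(sum / num_frames);
    }
}

On a GPU, each pixel of this loop can be processed in parallel, which is why this class of problem benefits so dramatically from GPU computing.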

 

GPU Computing: Problems

To build a market, competitors often have to cooperate initially. Right now, I wonder if there is enough cooperation. Markets have trouble advancing when there are competing standards because developers don't know where to build. OnLive showcased how this is a problem with gaming consoles, which are in decline at the moment for a lot of reasons, including that each has its own platform. It is interesting to note that the strongest platform at the moment -- defined by the richest set of games and game sales opportunities -- is the Xbox, which shares a code base with the PC.

 

The industry is moving away from proprietary platforms to OpenCL (created by Apple and driven by the Khronos Group) and DirectCompute (driven by Microsoft), which should pull the parties together. However, we are still very early in this process. Initially, Nvidia had CUDA and AMD/ATI had Stream, which aren't compatible at all. That Apple and Microsoft are key to the common standards both companies now support is important, but giving up leadership to two vendors that don't cooperate well with each other could be a problem in the long term. Right now, GPU computing is a huge green field where the majority of new opportunity comes at the expense of aging supercomputer hardware and includes a ready pool of people who simply have not had access to performance at this level and desperately need it. If the key players, at least initially, can put aside their differences and work together to open up this market, the result likely will be vastly more lucrative for all.
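For readers wondering what a vendor-neutral standard like OpenCL buys developers, here is a minimal vector-add sketch in C. It is an illustration of the general programming model rather than anything shown at the conference: the kernel source is compiled at run time by whatever OpenCL driver is installed, so in principle the same program runs on Nvidia, AMD/ATI or other hardware -- exactly the common ground that CUDA and Stream alone couldn't offer. Error checking is omitted for brevity.

/* Minimal OpenCL vector-add sketch (illustrative only). */
#include <stdio.h>
#include <CL/cl.h>

static const char *kernel_src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Pick the first available platform and device -- any vendor's driver works. */
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_DEFAULT, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Build the kernel from source at run time -- this is what makes it portable. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    /* Copy inputs to the device and allocate space for the result. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof(a), a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof(b), b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof(c), NULL, NULL);

    clSetKernelArg(k, 0, sizeof(da), &da);
    clSetKernelArg(k, 1, sizeof(db), &db);
    clSetKernelArg(k, 2, sizeof(dc), &dc);

    /* Launch one work item per element, then read the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof(c), c, 0, NULL, NULL);

    printf("c[10] = %f (expected 30.0)\n", c[10]);
    return 0;
}

Building this requires linking against an OpenCL SDK (for example, -lOpenCL on Linux), but notably it can be any vendor's SDK -- which is the whole point.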

 

Wrapping Up: Competition vs. Cooperation

GPU computing, which is improving performance by 10 to 1,000 times in areas ranging from media creation to medical advancement, is on the cusp of transforming a major part of the computing industry. This is to supercomputers what PCs were to mainframes, and I doubt the world will ever be the same. Given the massive performance jumps being demonstrated, if you work in the areas getting the most focus and seeing the largest benefits (finance, medical/scientific research, military simulation/monitoring, and multimedia) and aren't on top of this, the world likely will pass you by very quickly.

 

I haven't seen potential like this since the early PC days, and this kind of change turns peasants into kings and kings into peasants. Saying "be careful out there" seems woefully inadequate at the moment.


