IBM Moves to Scale Deep Learning

    Last night, IBM made an interesting announcement. IBM Labs has come up with a new software method that massively reduces the efficiency losses that mount as GPUs are added to a deep learning job. Using a highly tuned combination of NVLink and InfiniBand, the researchers effectively took what was a wall and turned it into a big door. The effort showcases the very real benefits of a dedicated R&D function, the power of the NVIDIA partnership, and the likelihood that IBM’s focus on Cognitive Computing will start paying off well.

    Let’s start with the announcement and work into the other areas this week.

    Distributed Deep Learning (DDL) Library

    IBM’s newly announced Distributed Deep Learning library hooks into deep learning frameworks like TensorFlow and Caffe, and it has the potential to change the related landscape significantly. At issue was a common problem: while deep learning frameworks ran well on a single server, you hit a performance wall if you tried to scale beyond it. As a result, training efforts were often measured in days when the market was setting a requirement closer to seconds.

    Deep learning is used in critical areas like fraud detection, identifying military risks, identifying diseases and remedies, and, increasingly, security at places like airports and borders. Inference engines handle the actual threats, but the models behind them must first be trained, and trained rapidly. If it takes days to train the deep learning system they depend on, then by the time they are ready for the threat, the damage may have already been done.

    What IBM’s DDL does is let you expand beyond a single server to massive numbers of GPUs, which can then solve problems in hours, or even minutes, and eventually seconds. The cost would be relatively high, but given this level of threat, that cost could still be a bargain weighed against the damage that could now be more effectively mitigated or even eliminated.
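    IBM has not published DDL’s API here, so the following is only a generic sketch of the underlying idea: synchronous data-parallel training, where each worker computes gradients on its own shard of the data and an all-reduce step averages them so every worker applies the same update. The worker and function names are illustrative, not IBM’s.

```python
# Illustrative sketch (NOT IBM's DDL API): synchronous data-parallel
# training with gradient averaging. Each simulated "worker" holds a
# shard of the batch; all_reduce_mean stands in for the collective
# operation that, in a real cluster, would ride on a fast interconnect
# such as NVLink or InfiniBand.

def local_gradient(w, shard):
    """Gradient of mean squared error for a 1-D linear model y = w * x."""
    g = 0.0
    for x, y in shard:
        g += 2.0 * (w * x - y) * x
    return g / len(shard)

def all_reduce_mean(grads):
    """Stand-in for the collective that averages gradients across workers."""
    return sum(grads) / len(grads)

def train_step(w, shards, lr=0.01):
    # In a real system each gradient is computed in parallel on its own GPU.
    grads = [local_gradient(w, s) for s in shards]
    return w - lr * all_reduce_mean(grads)

# Toy data following y = 3x, split across 4 simulated workers.
data = [(x, 3.0 * x) for x in range(1, 9)]
shards = [data[i::4] for i in range(4)]

w = 0.0
for _ in range(200):
    w = train_step(w, shards)
print(round(w, 3))  # converges toward 3.0
```

    The point of the sketch is the structure, not the model: the all-reduce is the step whose cost explodes as GPU counts grow, and it is exactly that communication step IBM claims to have made efficient at scale.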

    While this technology will be applied to Watson at some future date, it is being provided to IBM customers so they can build their own custom deep learning systems.


    IBM Labs

    This is a reminder of how important a dedicated lab function is within a company. This dedicated research focus allows some of the most talented people in a firm to focus on strategic problems like scaling deep learning systems until they find a solution. Granted, sometimes the breakthroughs seem few and far between, but when they hit, they have the potential, if applied, to change the market and vastly improve the firm’s position in it.

    This concept of a dedicated lab has fallen out of favor of late. Other firms tend to operate on a strategy of acquiring companies to get ideas, but there is a ton of extra cost and risk in buying another company, and few do it well. Having the capability inside the firm to create offerings like this one can be far cheaper than risking an expensive bad acquisition.

    NVIDIA Partnership

    IBM couldn’t have done this without NVIDIA, and the resulting offering should not only help IBM sell more deep learning servers but also significantly increase the number of NVIDIA GPUs used for deep learning. This is the way partnerships should work: both firms benefit together. That shared revenue benefit tends to bind the firms closer and puts a higher premium, among the executive staffs involved, on sustaining the partnership. Too often, partnerships amount to one partner acting as a sales channel for the other, and the result is neither lasting nor all that beneficial to either side. In this case, the solution requires both parties to be engaged, and scaling out sells a blended IBM/NVIDIA product.

    That’s one of the things that makes this partnership so fascinating; this is an unusually tight partnership and one both firms likely should use as a template for others.

    Cognitive Computing

    IBM and NVIDIA both seemed to grasp early on that thinking machines were going to be the next huge thing. Both firms pivoted hard toward the opportunity and are working in a variety of spaces to make this concept of intelligent systems a reality.

    This is the nature of focus. The problem with trying to build platforms for everything is that they are typically ideal for nothing, and you are at the mercy of whoever is dominant in the space. But if you focus on a segment, you can often build a better product, and your limited resources, concentrated in that segment, can overmatch the dominant vendor, which must cover all segments.

    Cognitive Computing was a gamble for IBM because the full market hadn’t emerged yet when it made the pivot. But, the firm hung in there, and thanks to efforts like this, IBM has created the impression of segment leadership.

    Wrapping Up: Deep Learning’s Full Potential

    Moving to scale deep learning systems is one of the major steps needed to mainstream the technology and let it grow to its full potential. This initially takes training times from days to hours and puts us well down the path toward seconds. Once we reach seconds, and systems can learn in real time, we’ll see a rate of change in the related markets we can’t even imagine now.

    IBM has a decent shot at owning the emerging segment but only because it focused early and created the kind of deep partnership with NVIDIA that was needed to get the job done. Oh, and having IBM Labs was the hidden ace that appears to also have paid off massively for the firm. It is nice to see a strategic plan come together like this because, these days, few firms can spell strategic. IBM remains one of those few.


    Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm.  With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+
