    What IBM’s New Chip Means for Data

    By now, you’ve heard all the hoopla over IBM’s new brain-like chip. There’s little doubt that this is a significant chip innovation, but what interests me is what this new development means for data.

    Most of the news has focused on the similarities between SyNapse’s TrueNorth and the human brain. Actually, as revealed last week, the technology represents 16 million neurons across a board of chips, which is a good deal short of the 100 billion neurons in the human brain, according to Steve Furber, a professor of computer engineering at the UK’s University of Manchester.

    Furber was a co-designer of the original ARM processor chip in the 1980s. For the past three years, he has worked on a project that would model 1 billion neurons, according to the UK Register.

    “Sixteen million neurons is roughly the scale of a frog brain,” Professor Furber told the UK Register. “So, the IBM board may be able to catch a fly for its dinner.”

    So, what does the equivalent of a frog’s brain get you when it comes to data? As it turns out, a frog’s brain can fill a pretty significant technology hole.

    The real benefit of TrueNorth isn’t raw power; as the MIT Review notes, conventional microprocessors can outdo it at data crunching. It’s the ability to handle unstructured data: images, sounds and other sensory data. Ars Technica has a more technical explanation and illustration of how it does this, but the MIT Review explains it simply: the chip doesn’t separate memory and processing blocks. Instead, the neurons and synapses “intertwine” those functions, and that, it turns out, also changes how the chip handles data:

    “When data is fed into a SyNapse chip it causes a stream of spikes, and its neurons react with a storm of further spikes. … it doesn’t work on data in a linear sequence of operations; individual neurons simply fire when the spikes they receive from other neurons cause them to.”
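    To make that concrete, here is a minimal Python sketch of the kind of event-driven, spiking behavior the quote describes. It illustrates the general leaky integrate-and-fire model that neuromorphic designs like TrueNorth draw on, not IBM’s actual silicon; the Neuron class, thresholds and weights below are all illustrative assumptions.

        # Illustrative sketch only -- not IBM's TrueNorth design. Each neuron
        # stores its own state (memory) and applies its own threshold test
        # (processing), and work happens only when spikes arrive, not in a
        # linear sequence of operations.
        import random

        THRESHOLD = 1.0  # potential at which a neuron fires (assumed value)
        DECAY = 0.9      # leak applied each tick a neuron stays silent (assumed)

        class Neuron:
            def __init__(self):
                self.potential = 0.0   # the neuron's "memory"
                self.synapses = []     # outgoing (target_index, weight) pairs

            def receive(self, weight):
                self.potential += weight

        def step(neurons, external):
            """Advance the network one tick."""
            for idx, weight in external:          # inject input spikes
                neurons[idx].receive(weight)
            # Event-driven: only neurons pushed over the threshold do any
            # work this tick; everything else just leaks a little.
            fired = [i for i, n in enumerate(neurons)
                     if n.potential >= THRESHOLD]
            fired_set = set(fired)
            for i, n in enumerate(neurons):
                if i in fired_set:
                    n.potential = 0.0             # reset after firing
                else:
                    n.potential *= DECAY          # silent neurons leak
            for i in fired:                       # spikes cause further spikes
                for target, weight in neurons[i].synapses:
                    neurons[target].receive(weight)
            return fired

        # Wire up a small random network and feed it a stream of spikes.
        random.seed(7)
        neurons = [Neuron() for _ in range(16)]
        for n in neurons:
            n.synapses = [(random.randrange(16), random.uniform(0.3, 0.9))
                          for _ in range(4)]
        for t in range(10):
            print(f"t={t}: fired {step(neurons, external=[(0, 1.2)])}")

    The point of the sketch is the control flow: there is no central program counter marching through the data in order. An incoming spike simply raises a few potentials, and any neuron pushed over its threshold fires in turn, just as the quote describes.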

    So, essentially, the chip can process more complex data, even massive amounts of it. It can also identify patterns much as the human brain does, something existing technology cannot. And it uses very little power.

    But here’s what didn’t make many of the stories, despite possibly being the best part: It can process that data in real time, according to Reuters.

    It looks like this chip isn’t just a major technology breakthrough, but a significant game changer for using data from the Internet of Things or, as CNET points out, other, bigger things.

    And you can bet IBM knows it.

    “Once commercialized, such a chip could act as a low-power sensor for a range of embedded and portable devices. It could become the silicon brain for the ‘Internet of things,’” Dharmendra Modha, IBM Research fellow and chief scientist, told PC World. “It could transform the mobile experience as we know it.”

    Loraine Lawson is a veteran technology reporter and blogger. She currently writes the Integration blog for IT Business Edge, which covers all aspects of integration technology, including data governance and best practices. She has also covered IT/Business Alignment and IT Security for IT Business Edge. Before becoming a freelance writer, Lawson worked at TechRepublic as a site editor and writer, covering mobile, IT management, IT security and other technology trends. Previously, she was a webmaster at the Kentucky Transportation Cabinet and a newspaper journalist. Follow Lawson at Google+ and on Twitter.
