The Future of Advanced Analytics

Michael Vizard

Fundamental change is coming to the way analytics applications are processed. Until now, analytics applications have been making do with compute engines that were largely designed to process transactions. But as analytics becomes a core capability that drives actual business decisions, Brenda Dietrich, CTO for business analytics at IBM, says the time to start treating it as a distinct type of application workload is at hand.

Speaking at an IBM Smarter Analytics and Big Data Leadership Summit in New York this week, Dietrich said IBM is in the early stages of working on hardware systems that will be specifically designed to optimize the processing of analytics data. This capability is needed, she said, because in the not-too-distant future, analytics will become the dominant type of application workload processed by IT.

At the rate data is growing, she added, it's apparent that IT organizations will need systems that can scale from terabytes to exabytes.

Perhaps most interesting of all, Dietrich says that by 2015, about 80 percent of all the data processed by IT will be uncertain to one degree or another. This is because the data that is collected is generally incomplete, ambiguous, in conflict with other data, fed into flawed models, or simply not available at the time it's needed. This fact of life, says Dietrich, means the successful application of analytics in the future will require not only tools that can make sense of all that uncertain data, but entirely new data management techniques. The end goal, says Dietrich, should be to deliver analytics applications that can confidently make decisions on behalf of the organization based on the data being collected, no matter how flawed it may be.
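To make the notion of reasoning over uncertain data a bit more concrete, here is a minimal, hypothetical sketch (invented for illustration, not something Dietrich or IBM described) of how conflicting values for the same figure might be reconciled by weighting each source with a confidence score:

# Hypothetical illustration: reconciling conflicting values for the same metric
# by weighting each source with a confidence score. The function name, data
# and weights are invented for the example and reflect no IBM product.

def weighted_estimate(readings):
    """Combine (value, confidence) pairs into one estimate plus an overall confidence."""
    total_weight = sum(conf for _, conf in readings)
    if total_weight == 0:
        return None, 0.0
    estimate = sum(value * conf for value, conf in readings) / total_weight
    return estimate, total_weight / len(readings)

# Three sources report different figures for the same customer's monthly spend.
readings = [(1200.0, 0.9), (1150.0, 0.6), (1600.0, 0.2)]
value, confidence = weighted_estimate(readings)
print(f"estimate={value:.2f}, confidence={confidence:.2f}")

The point of the sketch is simply that the application still produces a usable answer, along with a measure of how much to trust it, even though the inputs disagree.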

With the advent of social media, inexpensive sensors, voice-over-IP and any number of other applications and systems that stream data, Dietrich says the number of "nodes" creating data is increasing exponentially. The challenge facing IT these days, she says, is to find ways to make it easier to create and develop applications that make sense of all that data, and then deliver it in a way that is actually consumable by the average person. That thinking is partially what went into the development of a new Signature series of analytics systems that are optimized for specific functions, such as identifying fraud, determining the next-best action an organization should take, or tracking the key performance metrics a chief financial officer should be following.
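As a generic illustration of what "making sense" of a raw stream can look like at the smallest scale, the hypothetical sketch below reduces a stream of sensor readings to a rolling average a person could actually read; it is not a description of the Signature series or any IBM product:

# Hypothetical illustration of turning a raw event stream into something
# consumable: a rolling average over the most recent readings from one node.
# Generic sketch only; names and data are invented for the example.

from collections import deque

class RollingAverage:
    """Maintains the average of the most recent `window` values from a stream."""
    def __init__(self, window=100):
        self.values = deque(maxlen=window)

    def add(self, value):
        self.values.append(value)
        return sum(self.values) / len(self.values)

stream = [21.4, 21.9, 35.0, 22.1, 21.8]  # e.g., temperature readings from one sensor
monitor = RollingAverage(window=3)
for reading in stream:
    print(f"reading={reading:.1f}, rolling_avg={monitor.add(reading):.1f}")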

It may not be apparent just yet, but we're actually witnessing the beginning of some significant changes in the way IT architectures process analytics as a distinct class of applications. As that change occurs, the analytics applications themselves will evolve in ways that may leave us viewing the current generation of analytics applications, and the systems they run on, as the digital equivalent of crude Stone Age tools.
