I am the weak link in the real-time data chain. That's because there's a delay -- sometimes a significant one -- between my brain processing information and telling the rest of my body what it needs to do. Not only that, but too often my body can't perform the task exactly as desired. And sometimes my brain overrides its original decision. "Nah, it won't hurt to have another piece of cheesecake."
I fear companies are having, and will continue to have, similar problems with all of the data they are collecting. It takes time and effort to run reports and get them to the relevant people, and additional time for those people to tell the relevant systems what to do, assuming the right systems are even in place.
What companies need, writes Dan Woods on Forbes, is something he calls operational intelligence. This isn't exactly a fresh term. I've been reading and writing about operational business intelligence, or real-time business intelligence, for a couple of years now. The idea, says Woods, is for operational intelligence to monitor and combine streams of data from multiple sources, including data warehouses, Web-based services, video, geo-spatial data and streaming sources. Software from companies like QlikView, Vitria Technologies and TIBCO Spotfire is beginning to make this possible. Seeing really is believing, he writes:
Multiple data sets can be related, combined and displayed. As new questions are asked, graphics change immediately. The narrative is visual. Most executives can operate the displays themselves, eliminating time-consuming requests for new reports. Such technology is changing the paradigm for senior meetings. Instead of a fat packet of reports, a display is at the center of an evolving conversation.
But two challenges are standing in the way, says Woods. I alluded to them in my first paragraph. Companies need to combine incoming data into an integrated form that can be navigated interactively. (In theory, the way your brain does.) Then, they need the ability to take appropriate action based on what they've discovered, either through process automation or collaboration software such as wikis or blogs, suggests Woods.
IT Business Edge's Mike Vizard wrote about this last week, pointing out that complex event processing (CEP) will likely be part of the solution. While CEP doesn't completely remove the need for custom coding, it makes it simpler for companies to deploy real-time applications. As often happens in technology discussions, however, companies haven't really stopped to consider how the emergence of real-time (or, let's be realistic, near real-time) applications will affect their usual business processes. Writes Vizard:
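The core idea behind CEP is matching patterns over events as they arrive, rather than querying data that has already been stored. Here's a toy sketch of that idea, not any vendor's actual API: a sliding-window rule (all names and thresholds invented for illustration) that raises an alert when too many matching events land inside a time window.

```python
from collections import deque
from datetime import datetime, timedelta

class ThresholdRule:
    """Alert when more than `limit` matching events arrive within `window`."""

    def __init__(self, predicate, limit, window):
        self.predicate = predicate  # function deciding if an event matters
        self.limit = limit          # max matching events tolerated in the window
        self.window = window        # timedelta defining the sliding window
        self.hits = deque()         # timestamps of recent matching events

    def process(self, event):
        # Each event is assumed to be a dict with a 'ts' datetime field.
        if not self.predicate(event):
            return None
        self.hits.append(event["ts"])
        # Drop timestamps that have slid out of the window.
        cutoff = event["ts"] - self.window
        while self.hits and self.hits[0] < cutoff:
            self.hits.popleft()
        if len(self.hits) > self.limit:
            return f"ALERT: {len(self.hits)} matching events within {self.window}"
        return None

# Example: flag a burst of error events.
rule = ThresholdRule(lambda e: e["type"] == "error",
                     limit=2, window=timedelta(minutes=5))
```

A real engine adds event correlation across streams, persistence, and a query language, but the shape is the same: the logic runs against the stream itself, so action can follow within moments of the triggering events.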
So the question is, are organizations really up to the task of conducting business in real time, or is this another example of where the technology is again out in front of the business reality?
Among the processes that will be impacted most by the emergence of real-time applications are those designed to ensure data quality. IT Business Edge's Loraine Lawson conveys this quite effectively in her interview with Sarah Burnett, a senior research analyst with The Butler Group. Said Burnett:
There are many data quality software tools that can be used to improve the quality of data that comes into the BI or CPM (corporate performance management) system. But there is more to data quality than just tools. Data quality should be built into processes so that data is correctly captured and stored, that errors are not introduced in other processes that use the data, and that the data is integrated, i.e., brought together from different systems so that the information that it provides can be compared and contrasted to provide intelligence: Where is Widget X selling the best? Is it actually selling at a profit?
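Burnett's closing questions can only be answered once data from separate systems has been brought together. A minimal sketch of what that integration step looks like, with invented figures, field names and systems: joining order-system sales records with per-unit cost data from an ERP to compute profit by region.

```python
# Hypothetical records pulled from two separate systems.
sales = [  # from the order system
    {"sku": "widget-x", "region": "East", "units": 120, "revenue": 2400.0},
    {"sku": "widget-x", "region": "West", "units": 200, "revenue": 3800.0},
]
costs = {"widget-x": 18.50}  # per-unit cost from the ERP system

def profitability(sales, costs):
    """Combine revenue and cost data to get profit by region."""
    report = {}
    for row in sales:
        unit_cost = costs[row["sku"]]
        report[row["region"]] = row["revenue"] - unit_cost * row["units"]
    return report
```

With these made-up numbers, the West region moves more units, but the East region earns more profit, exactly the kind of contrast that only emerges when the two systems' data is compared side by side, and that bad data in either system would silently distort.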