Getting the Right Balance: Data Quality vs. Timeliness

Ann All

An interesting blog post by OpenBI co-founder Steve Miller on Information Management got me thinking about how IT's good intentions can sometimes backfire. Writing about the value of approximate BI, Miller shares a conversation with a conference attendee, a business intelligence professional who told him that his management preferred a quickly produced, approximate answer to a precise but belated one.

IT spends lots of time and money improving data quality, in an effort to ensure data is as accurate as possible. Do business users appreciate this? Maybe not, given that one of the chief complaints about BI is that the right information isn't getting to the right people at the right time. Miller includes a statement from data science evangelist Mike Loukides:

Do you really care if you have 1,010 or 1,012 Twitter followers? Precision has an allure, but in most data-driven applications outside of finance, that allure is deceptive. Most data analysis is comparative: if you're asking whether sales to Northern Europe are increasing faster than sales to Southern Europe, you aren't concerned about the difference between 5.92 percent annual growth and 5.93 percent.
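
Loukides' point is easy to demonstrate: in a comparative analysis, the decision rests on which number is bigger, not on its trailing digits. Here is a minimal Python sketch, with all sales figures invented purely for illustration:

# Hypothetical year-over-year sales figures (prior year, current year).
sales = {
    "Northern Europe": (1_000_000, 1_059_200),
    "Southern Europe": (1_000_000, 1_048_700),
}

def annual_growth(prior, current):
    """Year-over-year growth as a fraction, e.g. 0.0592 for 5.92 percent."""
    return (current - prior) / prior

growth = {region: annual_growth(*years) for region, years in sales.items()}

# The comparison -- the answer the business actually needs -- is the same
# whether the figures are carried to two decimal places or rounded to one.
leader = max(growth, key=growth.get)
for region, g in growth.items():
    print(f"{region}: {g:.1%}")
print(f"Growing faster: {leader}")

Rounded or not, the same region comes out ahead; the extra precision never enters the decision.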

Miller also cites an excellent MIT Sloan Management Review article by "Competing on Analytics" author Tom Davenport and SAP co-CEO Jim Hagemann Snabe, in which the two make a point that perhaps should be strikingly obvious -- but isn't, judging by how most companies approach BI: Not all information is needed equally fast, nor in equally perfect condition. I wonder whether many business users intuitively grasp this idea, and whether that understanding at least partly drives the demand for more self-service BI tools.

Yet it's worth noting (as Davenport and Snabe do) that the business side often contributes to the problem, with executives asking for more information than they need to make decisions. In today's highly competitive business environment, it's important for executives to more accurately identify the information they need and the time frame in which they need to receive it. (Emphasis mine.) As one of the CEOs the authors interviewed put it, in a best-case scenario companies would have "a timetable for each piece of information."

Once managers make an effort to do this, self-service tools can be quite beneficial. Write Davenport and Snabe:

... The net result of these technologies is that executives can do more of their own query and analysis, and need not wait more than a second or two for the results. That leads to an entirely different perspective on and process for information delivery. Consumption of the information is made more likely by the very fact that the manager has requested it; the information has been pulled rather than pushed. ...

Self-service tools are one of the technologies Davenport and Snabe mention as facilitating faster delivery of desired information. Others they list include in-memory applications that allow rapid queries and interactive analysis, and new types of databases optimized for query and reporting.
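
As a rough illustration of the in-memory idea -- a toy sketch, not the enterprise platforms the authors have in mind -- even Python's built-in SQLite can hold a dataset entirely in RAM and answer ad hoc queries interactively (the table and figures below are invented):

import sqlite3

# ":memory:" keeps the whole database in RAM, so ad hoc queries return
# immediately instead of waiting on a scheduled batch report.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", 2011, 1_000_000), ("North", 2012, 1_059_200),
     ("South", 2011, 1_000_000), ("South", 2012, 1_048_700)],
)

# A self-service-style query: the manager pulls the answer on demand.
query = """
    SELECT region,
           SUM(CASE WHEN year = 2012 THEN amount END) /
           SUM(CASE WHEN year = 2011 THEN amount END) - 1 AS growth
    FROM sales
    GROUP BY region
"""
for region, growth in conn.execute(query):
    print(f"{region}: {growth:.1%} year-over-year growth")

The point isn't SQLite itself; it's that when the data sits in memory and the query path is short, the "wait no more than a second or two" experience the authors describe becomes plausible.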

Yet IT will need to do more than simply get executives to nail down what information they want and when, and offer them more self-service options. While adding new technologies may help, IT will ultimately need to introduce new processes and measures of cycle time and quality, write Davenport and Snabe. They offer examples of companies that are doing this, including Procter & Gamble, which has reoriented the primary purpose of its IT organization toward providing information and analysis for decision making. P&G has even renamed its IT organization "Information and Decision Solutions."

Most companies are probably a long way from P&G's level of maturity. But just getting IT and business stakeholders together for a heart-to-heart on how to balance data quality against timely delivery of information should yield plenty of valuable insights and help companies develop a more strategic approach to BI.

A reader named Goeffrey P shared a concern in comments following Miller's blog post, noting that a "real risk with approximations is that speed becomes more important than accuracy -- at which point the budget for Master Data Management and all forms of robust quality governance are cut because the perception is that accuracy is no longer a business requirement." It's a valid worry, but I think that just getting IT and business leaders together for a discussion like the one I mention above may actually help make the case for data governance.

It fits right in with the kind of broad view of data governance that IT Business Edge colleague Loraine Lawson wrote about in February, sharing Obsessive-Compulsive Data Quality blogger Jim Harris' definition:

From my perspective, the primary focus of data governance is the strategic alignment of people throughout the organization through the definition, and enforcement, of policies in relation to data access, data sharing, data quality, and effective data usage, all for the purposes of supporting critical business decisions and enabling optimal business performance.

