Just the Stats: No One Is Confident About Data's Quality

Loraine Lawson

This quarter, I've written a good deal about data quality. Along the way, I've compiled some recent statistics that I think say a lot about the status of data quality in organizations today.


Doubting the Data


From the numbers I ran across this past quarter, it's clear there's a good amount of insecurity and doubt about data's accuracy. Both IT and business users seem aware of the bad data problem, particularly missing or incomplete data, as these numbers reveal:


Nearly 90 percent. U.S. business and IT leaders who worry they're working with inaccurate information, according to an Experian QAS survey of 1,320 business representatives across industry sectors in the U.S., Europe, Asia and Australia.


83 percent. IT pros who said they do not have confidence in the accuracy of information stored on company databases, according to an Informatica Corporation poll of over 600 sales, marketing and IT professionals across Europe, the Middle East and Africa.


6 percent or more. Portion of their databases estimated to contain incorrect or missing contact data, according to 60 percent of respondents in an Experian QAS survey of insurance, financial and retail organizations.


3. The three most common, and most harmful, errors in databases: incomplete/missing data, outdated information and incorrect data, according to the same survey.


25 percent or more. Existing master data that is missing required fields or contains outdated information, according to an Aberdeen Group survey of 176 organizations.


Paying the Data Piper


Bad data isn't just an inconvenience. It costs time and money, in enormous amounts of both, according to the statistics:


$700 billion. Amount that poor-quality data costs U.S. businesses in inefficiencies and lost customers each year, according to Ovum's estimates.


$90,000. Amount spent per year on data management by the average respondent, according to the aforementioned Aberdeen Group report.


13 hours or more. Time to fix a single master data exception or error, according to the same survey. And, by the way, respondents estimated an average of 71 serious errors or exceptions for every 1,000 master data records.

Seven decades. Time it would take one employee, working standard work weeks, to fix all the errors in master data records, according to Aberdeen's estimates.
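
If you want to check that math yourself, the back-of-envelope sketch below uses the survey's own per-error figures. The 150,000-record database and the 2,000-hour work year are my assumptions, chosen because they roughly reproduce Aberdeen's seven-decade figure; they are not numbers from the report.

    # Back-of-envelope math using Aberdeen's per-error figures.
    # The database size and work-year length are illustrative
    # assumptions, not figures from the survey.
    ERRORS_PER_1000_RECORDS = 71   # serious errors per 1,000 master data records
    HOURS_PER_FIX = 13             # hours to resolve one exception or error

    records = 150_000              # assumption: size of the master data store
    hours_per_year = 2_000         # assumption: 50 weeks x 40 hours

    errors = records * ERRORS_PER_1000_RECORDS / 1000
    total_hours = errors * HOURS_PER_FIX
    years = total_hours / hours_per_year

    print(f"{errors:,.0f} errors x {HOURS_PER_FIX} hours = {total_hours:,.0f} hours")
    print(f"Roughly {years:.0f} work years for one employee.")

Run with those assumptions, it works out to about 138,450 hours, or 69 work years: right in line with the seven decades Aberdeen cites.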


And yet ...


5.5. Rating, on a scale of 1 (low) to 10 (high), that senior management gave master data as a priority, thus, Aberdeen concludes, "revealing that in the upper corporate echelons there are still many who still don't recognize the business value of good data governance."




Data quality is slowly coming into its own as an important issue, however. That may explain this Gartner prediction:


12 percent. Rate at which the data quality tools market will grow each year for the next five years.
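
To put that rate in perspective, steady 12 percent annual growth compounds to roughly 76 percent over five years. A quick check, assuming the rate holds constant each year (my simplification, not part of Gartner's forecast):

    # Compound growth at 12 percent per year over five years,
    # assuming a steady rate (an assumption, not Gartner's claim).
    growth = 1.12 ** 5
    print(f"Market size after five years: {growth:.2f}x today's")  # ~1.76x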
