For Valentine’s Day, Talend published a fun infographic, “Use Big Data to Secure the Love of Your Customers.” It lists data quality as the second leading challenge with Big Data, but perhaps more striking is the $13.3 million annual financial impact caused by data quality problems.
I’m not entirely sure from the graphic which research group provided that stat, but a 2013 Gartner research paper put the cost higher, at $14.2 million a year.
Actually, there’s no shortage of scary statistics on the high cost of bad data. For instance, an infographic by Lemonly.com and Software AG claims that bad data costs organizations 10-30 percent of revenues.
Numbers like that are impressive, but also frustratingly vague. What does it mean to say data quality “costs” an organization 10-30 percent of revenues? How is that determined? And will fixing it cost more?
A recent piece by Dell Boomi CTO Michael Morton puts some specifics to data quality’s cost by looking at how bad data changes a hypothetical direct mail marketing campaign.
In his example, you’ll spend $20,000 each month on a campaign to “a highly targeted set of 100,000 customers in a specific market.” Even if you exclude employee time and operational costs, the mailing requires $240,000 for the year.
In previous years, this mailing has generated revenue of about $500,000. So, you’re profiting $260,000 each year from sales.
How does poor data quality change the equation? Morton assumes an error rate of 50 percent, which he notes is not uncommon for enterprise sales and marketing departments. That’s bad data in the form of duplicate entries, incorrect addresses, outdated lists and other errors.
Because 50 percent of your mailings are lost, you’ve cut the revenue in half, to $250,000. Against the $240,000 mailing cost, that leaves a profit of just $10,000. Add in employee time and other operational costs, and you’re most likely losing money, he adds.
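Morton’s arithmetic is simple enough to sketch in a few lines. This is just a back-of-the-envelope calculation of his hypothetical campaign, not anything from his article; the function name and parameters are my own.

```python
def campaign_profit(annual_cost, expected_revenue, error_rate):
    """Profit on a mail campaign when a fraction of mailings
    never reach a valid recipient (and so generate no revenue)."""
    effective_revenue = expected_revenue * (1 - error_rate)
    return effective_revenue - annual_cost

# Morton's example: $20,000/month spend, ~$500,000 historical revenue.
annual_cost = 20_000 * 12          # $240,000 per year
expected_revenue = 500_000

print(campaign_profit(annual_cost, expected_revenue, 0.0))   # clean data
print(campaign_profit(annual_cost, expected_revenue, 0.5))   # 50% bad data
```

With clean data the program nets $260,000; at a 50 percent error rate the same spend nets only $10,000, before you even count staff time.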
So, it’s easy to imagine how the average company could waste “$180,000 per year simply on direct mail that does not reach the intended recipient because of inaccurate data,” as Lemonly.com claims.
That’s just looking at one small area, but it’s not hard to imagine how that might create problems with customer service, safety recalls or suppliers.
Poor data quality can also add costs to IT systems such as a Configuration Management Database (CMDB) and content management systems, according to a recent blog post by vendor Effectual Systems. The vendor estimates that these systems contain 50 to 75 percent junk data. Fixing it manually adds up quickly, even at only 1,000 changes a month: in their example, that can cost IT $600,000 and 12,000 man-hours per year.
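Effectual’s figures work out to one man-hour per change at an implied rate of $50 an hour ($600,000 divided by 12,000 hours). A quick sketch of that math, with the hour-per-change and hourly rate treated as assumptions derived from their numbers rather than anything the vendor states directly:

```python
def manual_cleanup_cost(changes_per_month, hours_per_change=1.0, hourly_rate=50):
    """Annual man-hours and labor cost of manually fixing junk records.

    Defaults are assumptions back-calculated from Effectual Systems'
    example (1,000 changes/month -> 12,000 hours and $600,000 a year).
    """
    annual_hours = changes_per_month * 12 * hours_per_change
    return annual_hours, annual_hours * hourly_rate

hours, cost = manual_cleanup_cost(1_000)
print(f"{hours:,.0f} man-hours, ${cost:,.0f} per year")
```

The point of the exercise: even a modest change volume, handled by hand, scales linearly into a six-figure annual line item.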
If you’d really like to pinpoint the cost of bad data at your own organization, Gartner’s Ted Friedman offers more specifics in this IT Business Edge interview, “How to Measure the Cost of Data Quality Problems.”
Loraine Lawson is a veteran technology reporter and blogger. She currently writes the Integration blog for IT Business Edge, which covers all aspects of integration technology, including data governance and best practices. She has also covered IT/Business Alignment and IT Security for IT Business Edge. Before becoming a freelance writer, Lawson worked at TechRepublic as a site editor and writer, covering mobile, IT management, IT security and other technology trends. Previously, she was a webmaster at the Kentucky Transportation Cabinet and a newspaper journalist. Follow Lawson at Google+ and on Twitter.