
Data Quality for the Rest of Us


Written By
Loraine Lawson
Oct 4, 2013

By now, most of us are familiar with data quality “best practices.” Involve the business user. Correct the source. Establish data governance.

It sounds great—but it often falls flat in the real world. Why?

It’s too difficult, says Lyndsay Wise, president and founder of WiseAnalytics, an independent BI research and analysis firm.

“Many operational systems were developed years or even decades ago without processes for correcting inaccurate and inconsistent data entries,” Wise wrote in a recent TechTarget article. “A majority of companies have yet to implement master data management programs and systems that use master reference data to help identify and fix quality issues as data is entered into systems.”

As a result, many companies find that following data quality best practices translates into high costs and a lot of work for in-house developers or consultants.

A more practical approach, Wise contends, is to address data quality at the BI and data warehouse layer rather than trying to solve it in each siloed operational system.

She also says many business users seem to think that BI systems magically correct data quality problems. Not true, she points out. She suggests one of two approaches for cleaning up your data:

  • A central team applies the data quality tools as part of the integration process when loading the data into a central database (see the sketch below).
  • A central team applies the data quality tools after the data is moved into a centralized database, but before it enters the BI layer.
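
To make the first option a bit more concrete, here is a minimal sketch of centrally applied cleansing rules running during a load. Wise doesn’t prescribe any particular tooling, so everything below is a hypothetical illustration: the customers table, the field names, and the rules (trim whitespace, lowercase emails, normalize country codes, reject rows missing a key), written in Python with an in-memory SQLite database standing in for the central database.

    # Hypothetical sketch of Wise's first option: a central team applies data
    # quality rules while loading data into a central database, instead of
    # fixing each operational source system. All names here are illustrative.
    import sqlite3

    def clean_record(raw: dict) -> dict | None:
        # Trim whitespace on every string field.
        record = {k: v.strip() if isinstance(v, str) else v for k, v in raw.items()}
        if not record.get("customer_id"):          # reject rows missing the key
            return None
        record["email"] = (record.get("email") or "").lower() or None
        record["country"] = {"usa": "US", "u.s.": "US"}.get(
            (record.get("country") or "").lower(), record.get("country")
        )
        return record

    def load(conn: sqlite3.Connection, source_rows: list[dict]) -> int:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS customers "
            "(customer_id TEXT PRIMARY KEY, email TEXT, country TEXT)"
        )
        loaded = 0
        for raw in source_rows:
            record = clean_record(raw)
            if record is None:                      # skip (or quarantine) bad rows
                continue
            conn.execute(
                "INSERT OR REPLACE INTO customers "
                "VALUES (:customer_id, :email, :country)",
                record,
            )
            loaded += 1
        conn.commit()
        return loaded

    if __name__ == "__main__":
        rows = [
            {"customer_id": " 1001 ", "email": "Ann@Example.COM ", "country": "usa"},
            {"customer_id": "", "email": "unknown", "country": "U.S."},  # rejected
        ]
        with sqlite3.connect(":memory:") as conn:
            print(load(conn, rows), "rows loaded")   # -> 1 rows loaded

The design point, in Wise’s terms, is simply that the cleansing logic lives in one pipeline a central team controls, rather than being retrofitted into every source system.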

“Managing data quality activities in one place should result in lower costs compared with modifying data in individual source systems,” Wise adds.

It’s not a best practice, though. Data quality expert and Obsessive-Compulsive Data Quality blogger Jim Harris would probably call this “reactive data quality,” which may be necessary to repair immediate damage.

Harris has compared this to everything from tooth decay to dog whispering, but the theme is the same: When there’s no data governance or data quality at the source, you never address the root of the problem. And eventually, that causes more problems.
