It's pretty basic to say that you can't expect success with business intelligence and other data-based initiatives without a focus on data quality. That's a given. But recently, it seems I'm seeing more focus on the role data integration plays in creating quality data, and therefore, in supporting success with BI.
Of course, this could be a function of focus. Maybe I'm just noticing it more because I'm paying attention. But an upcoming presentation, sponsored by The Data Warehousing Institute, makes me think otherwise.
The webinar will feature Danette McGilvray, president and principal consultant at Granite Falls Consulting and author of "Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information."
BI and data warehousing projects require data to be integrated and to fulfill multiple reporting requirements. In the process of integrating this data, companies often uncover data-quality issues, which in turn complicate their BI and data warehousing projects, causing delays and increased costs.
Specifically, McGilvray will address how data-quality issues typically result from data integration, how to incorporate data quality throughout the project, and how data integration and quality can support or thwart strategic business goals.
The free event is scheduled for Wednesday, Sept. 16, at noon ET. It's sponsored by SAP.
For more on how data quality and integration affect enterprise-wide initiatives, check out these IT Business Edge resources: