When John Schmidt started his series on lean integration last week, he did so with this caveat:
"Based on my research, no-one has tackled this topic directly in the form of a paper or book. But the world is a big place, so if some of you readers have come across prior works, please let me know. In the meantime, you heard it here first!"
That very day, a very similar piece, "Lean techniques to help your data quality improvement initiative (Part 1: Time Value Maps)," was published on Data Quality Pro, an online community for data quality specialists. The piece was written by the site's founder, Dylan Jones.
It's a small online world after all.
Like Schmidt's post on lean integration, the Data Quality Pro post is part of an ongoing series. In the first article, Jones explained how time value maps can be applied to data quality projects, though the implications may extend into the business process:
"Data quality issues in the service chain can be a major cause of non-value added time creation. Most organizations 'waste' a major portion of their income to funding additional staff to handle complaints, chase missing records, deal with anomalies and generally clean up the mess poor data quality leaves behind. However, be prepared to uncover a whole range of additional business and technical issues as you unravel the complex service chains that are the fabric holding your business together."
This week, both series continue with new posts published Tuesday.
Schmidt's recent post is very accessible, yet specific and concrete, an unusual combination for a tech article. He applies the lean practice of eliminating waste to data integration, identifying four specific wasteful practices, among them variation.
What constitutes variation in data integration? According to Schmidt, it's using "different middleware platforms, development tools, interchange protocols, data formats, interface specifications and metadata standards to name a few."
The new Jones post digs into how you can apply a lean quality tool called Little's Law to your data quality initiatives.
Six Sigma aficionados will recognize Little's Law as the Law of Velocity. However, for those of you as clueless as I am, Little's Law is a mathematical equation proved by John Little, then of what is now Case Western Reserve University. It comes from queueing theory and can be applied to any stable queueing system.
In lean quality initiatives, it's used to calculate the lead time for business services, according to Jones. Even more specifically, the post explains why it matters in data quality initiatives:
"This simple tool enables us to calculate the lead time for a business service. This in turn allows us to quickly identify areas where effective data quality management will help us deliver faster services, reduce costs and most importantly - delight customers. If you're looking to accelerate the time-to-benefit of your data quality initiative this may just be the vital ingredient you're looking for."
The article goes on to explain the formula and how you'd use it, with a focus on simple, quick real-world applications. As the article notes, you don't need to be a black belt of any kind, Six Sigma or otherwise, to apply these tools, reflecting a point Jones made in a comment on last week's IT Business Edge blog post:
"The key thing for me is making people realize that waste is a really simple route to free profit. So many businesses go looking for new business to increase revenue or new IT innovations to cut costs when the profits are ripe for the picking if they just scrapped all their bad practices that incur massive costs. There is no need for big 5 consultants or mega-investment, low-hanging fruit is all around us."