How Bad Is Health Care Data Quality?

Loraine Lawson

How bad is the quality of clinical health care data? There are certainly issues, but it seems very few organizations actually know for sure.

In a recent post, Informatica’s Chief Healthcare Strategist Richard Cramer writes about testifying before a workgroup of the Health IT Policy Committee.

That’s important because the Office of the National Coordinator for Health Information Technology (ONC), part of the U.S. Department of Health and Human Services, oversees that committee. So eventually, this information finds its way to Congress.

He was one of a number of vendor representatives who shared their experience with the group last fall. What’s significant, though, is that in listening to the other experts, Cramer found his experience repeated time and time again.

The problem with the quality of clinical data is twofold, it seems.

First, few health care organizations have even invested in data quality. Remember how other industries bought data quality tools and then learned they had to backtrack and treat data quality as a discipline? Health care hasn’t even invested in the tools, much less the best practices, according to Cramer’s post, although I grant you, he said it much more eloquently.

He’s hopeful, however, that this means health care will learn from other industries and pursue data quality as a discipline.

“Done right, a data quality program doesn’t end but gathers momentum,” he told the working group. “Having the governance, stewardship, and tools that enable a continuous process of profiling data, monitoring quality and resolving discrepancies is an essential component of addressing the current and future challenge of ensuring the quality of clinical data as a key asset.”

But here’s the catch, and the second problem: Before health care can even go about getting data quality wrong or right, it needs to be able to share data across systems. And right now, that’s a very big problem, mostly because proprietary data structures and contractual issues keep the data from being shared, Cramer and others said.

“Although database managers are working hard and doing the best they can, they are all-too-frequently constrained by limited access to their own data within vendor applications,” Cramer writes.

The good news is that meaningful use seems to be moving organizations in the right direction, he continues.

For some reason, Cramer was afraid his comments would be controversial, but as I’ve shared in the past, it’s no secret that health care IT faces a major interoperability problem. And that’s why it’s pretty great that people like Cramer and experts from other industries are calling it out, saying, “We have the technology to fix this in other industries. Let us help.”

And fixing this interoperability and integration problem is exactly what Stage 2 of Meaningful Use will be about, according to Dr. John D. Halamka, Chief Information Officer of Beth Israel Deaconess Medical Center.

Halamka also chairs the New England Healthcare Exchange Network (NEHEN), and co-chairs the HIT Standards Committee. Somehow, he also finds time to be a professor at Harvard Medical School and a practicing emergency physician, according to his blog.

In a December post, Halamka said 2014 is the year we’ll start to see health care data interoperability become a reality.

Technically, Stage 2 only requires that providers be able to exchange clinical care summaries for 10 percent of their patients. But as Halamka points out, when it comes to technology, 10 percent is not that far from 100 percent interoperability.

“The application and infrastructure investment necessary to support 10% is not much different than 100%,” he writes. “The 10% requirement will bring most professionals and hospitals to the tipping point where information exchange will be implemented at scale, rapidly accelerating data liquidity.”

He also talks about the VA’s Automate Blue Button initiative, which is all about giving patients the ability to share their care records with the touch of a Blue Button. It’s pretty cool, actually, when you consider that thus far, clinical data is usually shared by fax machine or by a patient bringing in hard copies.

And what’s really neat is that it’s patient-focused, giving the patient control over sharing data.

“With certified technology, standards, and incentives to share data among providers and patients, 2013-2014 will usher in a new era of interoperability,” Halamka writes.

So it seems it’ll be at least a year, maybe two, before health care IT is really in a place to assess data quality. Still, interoperability is real progress, and I think we’re all ready for it.

If you’d like to read more on standards, check out this statement to a House subcommittee by Dr. Farzad Mostashari, the National Coordinator for Health Information Technology at the U.S. Department of Health and Human Services. It provides more detail about Stage 2 and what the HIT Standards Committee is recommending for interoperability.

Comments
Apr 7, 2013, Prasanth says:
You mentioned, “The problem with access to the quality of clinical data is twofold, it seems. First, few health care organizations have even invested in data quality.” What is the second problem?
Aug 19, 2015, Jessie Chimni says:
The entire enterprise architecture is flawed! Would you buy a car that would not tell you its health: speed, temperature, service status, fuel, RPM, tire pressure, etc.? Similarly, how can you buy enterprise applications that cannot measure the quality of the data, the key asset that is essential for all business operations? Organizations have to embark on data quality initiatives because the solutions are not out of the box. All have to be custom built.
