I think it's safe to say that never in all the ages of this old earth have people produced so much data.
There's already around 100 GB of data created for every single person on this planet, the equivalent of 24 tons of books and 1 trillion CDs, according to a recent Reuters article.
It's a bit ironic, then, that our information also has the shortest lifespan in history. Experts say technology changes so quickly that our digital data has a mere five- to 20-year lifespan, depending on how the digital information is stored. The European Union loses 3 billion euros (approximately US$3.7 billion) worth of information every year, according to the article.
In 50 years, we might not be able to access Stephen Hawking's digitally stored notes, yet we'll still be able to read the cave paintings of ancient man, decipher the hieroglyphics of Egyptian pyramids and flip through Einstein's paper notebooks.
Humbling, isn't it?
The point of the article is that European scientists and archivists are storing a "digital genome" deep in the Alps in an effort to ensure future generations can decipher digital data. This is more than a problem for historians, however. It could become a true business issue. As Andreas Rauber, a professor at the University of Technology of Vienna, explained:
In 25 years, people will be astonished to see how little time must pass to render data carriers unusable because they break or because you don't have the devices anymore. The second shock will probably be what fraction of the objects we can't use or access in 25 years and that's hard to predict.
It seems to me that this situation is a strong argument for data standards, particularly in health care and other critical records.
Admittedly, it's ridiculously hard to get organizations to embrace standards, for several reasons.
First, as Dr. Doug Fridsma recently pointed out to me, "The wonderful thing about standards is there are so many to choose from." Fridsma knows the struggle of standards first-hand: He is overseeing the effort to sift through standards for electronic medical records as acting director of the Office of Interoperability and Standards in the Office of the National Coordinator for Health Information Technology. (It's important to note that this is not a standards body, but rather an open effort to evaluate the existing standards to see what works and what doesn't.)
Second, organizations don't want to have to rip out working systems to comply with a standard. Bill Kenealy, a senior editor at Insurance Networking, pointed out this challenge in a recent feature article on insurance data standards. Companies are afraid of the upfront expense of learning and complying with standards, the article notes.
Kenealy also identifies a third reason organizations are skittish about standards: Right or wrong, they believe standards are too complex.
It's interesting to compare the standards effort by the insurance industry with the government's effort to agree on standards for national health care records.
The government has a big carrot to help ensure compliance with standards: It's offering incentive payments for electronic health care records, assuming you can show "meaningful use" and certification, which is largely based on compliance with standards.
ACORD, the standards body for the insurance industry, has no such obvious carrot. It's relying in part on the more intrinsic arguments for standards: efficiency, long-term savings and interoperability.
Standards are also getting a boost from the maturity of XML, the adoption of SOA and Web services, and the push for analytics, the article notes. There's also an interesting grass-roots movement in the insurance sector: A number of big insurance companies are trying to make the ACORD standards "plug and play," an intriguing concept that I wish the article had explained further.
I'd be interested to hear more on what has been done to promote the adoption of data standards.