“We need open data to change the world,” reads the headline of a recent Harvard Business Review blog post by Lucy Bernholz.
Bernholz is a visiting scholar at Stanford University's Center on Philanthropy and Civil Society, where she co-leads a research project focused on philanthropy, technology and policy. She highlights two non-profits that rely on the concept of open data for their success.
“Courtesy of broadband Internet and mobile wi-fi we now have global, low-cost ways to share data that every social-purpose organization can plug into,” she writes. “From it, they should be able to draw what they need about previous experiments, effective practices, and achieved outcomes, as well as what has failed and why. That way, local social entrepreneurs can mimic what works elsewhere and share their ideas, successes, and failures.”
Judging from the descriptions, I think she means open information more broadly than what technologists typically call “open data,” since one of the non-profits, the Awesome Foundation, actually seems to be less about data than about the open sharing of ideas.
The second non-profit, CrisisCommons, does use open data to help response organizations, volunteers and governments during natural disasters and other emergencies. Alas, the website doesn’t offer specifics, other than to note that the project does involve geospatial data. But it’s clear they mean actual datasets and not just ideas or proposals.
Bernholz sees open data as an unexplored asset for non-profits. “If we're going to scale any of our efforts to solve social problems we've got to make much better use of the fastest scaling tool humans have ever built: open data,” she writes.
Her piece opens up a broader discussion on how open data can create new models, not just for non-profits, but all organizations. And part of that discussion will have to involve standards.
Consider an example the Sunlight Foundation recently highlighted: an appraiser asked his county government for data from its database. No one questioned his right under the law to that information: everyone concurred that he was legally entitled to come down to the office and print it off. But he wanted it in an electronic format, culled from the county’s database.
The problem was that the government couldn’t easily separate the data from the proprietary mapping software it used.
So it offered the appraiser an electronic copy for $2,000 — the price to extract the data from the proprietary system.
That’s actually fair game since most open records laws stipulate that the government agency can charge you the cost of copying the information. But it’s also ridiculous that someone should have to pay that much for information the government collected at taxpayers’ expense.
“This is why the use of open formats is one of the Open Data Principles that we spend so much time talking about,” the Sunlight Foundation writes. “Government should buy the software it needs, but openness has to be a consideration when making those purchasing decisions.”
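To make the open-formats point concrete, here is a minimal sketch of what exporting records to an open format looks like. The parcel records and field names are hypothetical; the point is that a plain-text format like CSV can be read by any software, so no one has to pay to extract the data from a proprietary system later.

```python
import csv
import io

# Hypothetical parcel records, as a county system might hold them.
parcels = [
    {"parcel_id": "001-234", "owner": "Smith", "assessed_value": 185000},
    {"parcel_id": "001-235", "owner": "Jones", "assessed_value": 172500},
]

def export_to_csv(records):
    """Write records to CSV, an open, software-neutral format."""
    buf = io.StringIO()
    writer = csv.DictWriter(
        buf, fieldnames=["parcel_id", "owner", "assessed_value"]
    )
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

print(export_to_csv(parcels))
```

Any spreadsheet, database, or script can consume this output; that neutrality is what the Open Data Principles are asking purchasing officers to require.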
It’s not just a pain for government: across the board, a lack of standards raises serious barriers to open data and, by extension, to the potential of Linked Data. And even once standards are created, there’s the challenge of applying them. For an example of the resulting headaches, check out this SemanticWeb.com article on the Clinical Data Interchange Standards Consortium (CDISC) and its work on clinical trial standards.