Could Your Big Data Project Put the Company in Legal Jeopardy?

Loraine Lawson

The big buzz right now is about the President’s Council of Advisors on Science and Technology (PCAST) report, “Big Data and Privacy: A Technological Perspective.” The report looks at the technical aspects of Big Data technology and assesses how those capabilities might affect privacy for U.S. citizens at home and abroad.

The report examines Big Data capabilities and technical considerations such as data from sensors, data mining, data fusion and information integration, image and speech recognition, social media data, the cloud and encryption. Then, it discusses notice and consent laws in light of these new capabilities and reviews potential tools and solutions for protecting privacy. It’s a short list.

It would be tempting to treat this report as a policy problem for the lawyers, or as a mere curiosity. Don’t be so hasty: it actually offers a valuable implementation lesson for CIOs and other technologists.


Jeffrey Kelly, principal research contributor at The Wikibon project, points out in his Silicon Angle piece how the report uses the City of Boston’s Street Bump mobile app as an example of Big Data’s unintended consequences.

The app used a smartphone’s accelerometer and GPS to detect when a driver hit a pothole. The location was then sent to the city’s public works department, which would dispatch someone to fix it.
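To make the mechanics concrete, here is a rough sketch of the idea in Python. To be clear, this is my own illustration, not Boston’s actual code: flag a hard vertical jolt in the accelerometer stream and report the GPS fix where it happened. The ACCEL_SPIKE_G threshold and the detect_potholes function are hypothetical names.

```python
# Hypothetical sketch of accelerometer-based pothole detection.
# Street Bump's real pipeline is more sophisticated; this only
# illustrates the core idea: a hard jolt plus a GPS fix = a report.

ACCEL_SPIKE_G = 2.5  # assumed threshold for a pothole-sized jolt

def detect_potholes(samples):
    """Yield (lat, lon) for every reading whose vertical
    acceleration exceeds the spike threshold.

    `samples` is an iterable of (vertical_accel_g, lat, lon)
    tuples, e.g. streamed from the phone's sensors.
    """
    for accel_g, lat, lon in samples:
        if abs(accel_g) >= ACCEL_SPIKE_G:
            yield lat, lon

# Example: three readings, one of which looks like a pothole hit.
readings = [(1.0, 42.3601, -71.0589),
            (3.1, 42.3605, -71.0572),  # hard jolt -> report it
            (0.9, 42.3610, -71.0560)]

for lat, lon in detect_potholes(readings):
    print(f"Pothole reported at ({lat}, {lon})")
```

Notice what the sketch doesn’t capture: who owns the phones doing the detecting. That’s where the trouble starts.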

The problem? The app created a heavy bias toward fixing street problems in wealthy areas, where smartphones are more common.

At first glance, that may not seem like a big deal to you. Maybe you want wealthier clients, or maybe you think, “We’re not the government, so that kind of issue doesn’t apply to us.”

Think again. As Silicon Angle points out, ignoring that kind of bias could lead to discriminatory practices, which in turn could lead to lawsuits, even for private companies.

“But if such a program (unintentionally or not) results in price discrimination based on race, sex, class or some other factor proscribed by law, the retailer could (and should) face legal consequences,” Silicon Angle reports. “It is the retailer’s responsibility to identify such discriminatory practices and put a stop to them even if they are the result of just one of thousands of Big Data analytics projects.”

Fortunately, the report also offers a relatively simple fix: Boston managed to identify the unintended bias by testing the app internally before rolling it out.
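What might that kind of internal test look like? Here’s a minimal, purely hypothetical sketch in Python: compare report rates across neighborhoods grouped by income, and flag the dataset when the rates diverge sharply. The neighborhood figures and the reports_per_1k helper are invented for illustration.

```python
# Hypothetical pre-rollout bias check: compare report rates across
# neighborhoods grouped by median income. The numbers are made up
# purely to illustrate the kind of skew Boston found.

neighborhoods = [
    # (name, median_income_usd, residents, pothole_reports)
    ("Back Bay",   95_000, 18_000, 240),
    ("Dorchester", 45_000, 90_000, 310),
    ("Mattapan",   40_000, 23_000,  60),
]

def reports_per_1k(residents, reports):
    """Reports per 1,000 residents -- a crude proxy for coverage."""
    return 1000 * reports / residents

rates = {name: reports_per_1k(pop, n)
         for name, _, pop, n in neighborhoods}

for name, rate in rates.items():
    print(f"{name}: {rate:.1f} reports per 1k residents")

# Flag the dataset if the best-covered neighborhood reports at
# several times the rate of the worst-covered one.
if max(rates.values()) > 3 * min(rates.values()):
    print("Warning: report rates diverge sharply; check for "
          "sampling bias before acting on this data.")
```

The point isn’t the arithmetic; it’s the habit of running the comparison before you act on the data, not after.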

Identifying these sorts of potential legal minefields “requires not just the technology and manpower necessary to identify such discriminatory practices, but the mindset and will to do so as well,” writes Kelly.

The full PCAST report is available online, but if you’d rather not read all 76 pages, you can instead peruse the White House fact sheet. A summary of the White House’s (statistically unrepresentative) citizen poll about Big Data and privacy also provides further insight.

You can also check out Information Week Government’s news coverage of the report, which includes what I’d call a knee-jerk opinion from the U.S. Chamber of Commerce about “overregulation.” That comment seems naive at best, especially after reading GigaOm’s assessment.


