Should offenders on probation or parole be subjected to closer monitoring by authorities if a software program flags them as being more likely to commit another crime than someone who isn't flagged? Should this predictive software be used to make bail and sentencing recommendations?
Those questions have arisen following a University of Pennsylvania professor's development of software that ostensibly predicts criminal behavior. According to an article on Discovery.com, the professor, Richard Berk, led a team of researchers who created an algorithm that gives law enforcement officials a better chance of identifying future repeat offenders:
Beginning several years ago, the researchers assembled a dataset of more than 60,000 various crimes, including homicides. Using an algorithm they developed, they found a subset of people much more likely to commit homicide when paroled or probated. Instead of finding one murderer in 100, the UPenn researchers could identify eight future murderers out of 100. Berk's software examines roughly two dozen variables, from criminal record to geographic location. The type of crime, and more importantly, the age at which that crime was committed, were two of the most predictive variables.
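Berk's actual model has not been published in detail, so the following is purely an illustrative sketch of a tool with the general shape described above: score each case on a handful of variables, weighting age at offense and type of prior crime heavily, then flag the highest-scoring subset for closer review. The variables, weights, and threshold here are invented for illustration and are not Berk's.

```python
def risk_score(case):
    """Weighted risk score; higher means flagged as higher risk.
    All weights are hypothetical, chosen only to illustrate the idea."""
    score = 0.0
    # Age at offense: younger offenders weigh more heavily (one of the
    # two most predictive variables in the description above).
    if case["age_at_offense"] < 21:
        score += 2.0
    elif case["age_at_offense"] < 30:
        score += 1.0
    # Type of prior crime: violent priors weigh more (the other key variable).
    if case["prior_crime"] == "violent":
        score += 2.0
    # Number of prior convictions, weighted lightly.
    score += 0.5 * case["prior_convictions"]
    return score

def flag_for_monitoring(cases, threshold=3.0):
    """Return the subset of cases scoring at or above the threshold."""
    return [c for c in cases if risk_score(c) >= threshold]

cases = [
    {"id": 1, "age_at_offense": 18, "prior_crime": "violent", "prior_convictions": 3},
    {"id": 2, "age_at_offense": 45, "prior_crime": "property", "prior_convictions": 1},
]
flagged = flag_for_monitoring(cases)
print([c["id"] for c in flagged])  # prints [1]: only the first case is flagged
```

Even in this toy version, the concerns raised below are visible: someone chooses the variables, the weights, and the threshold, and those choices determine who gets flagged.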
The software is already being used in Baltimore and Philadelphia to predict which people on probation or parole are most likely to commit murder, and other jurisdictions are eyeing it for a variety of purposes, including making sentencing recommendations. The problem is that the idea smacks of the disturbing plot of the 2002 Tom Cruise film 'Minority Report,' in which technology was used to predict future crimes and the people who were supposedly going to commit them were taken into custody.
Beyond the creepiness factor, the fact that there's so much room for error concerns some security experts, including Bill Stanton, a former NYPD officer and founding partner of QVerity, a company that does training and consulting in the field of deception detection. (Full disclosure: I am also a partner in QVerity, and serve as the company's editorial director.)
In a recent appearance on Fox News, Stanton argued that software that can do no better than identify future murderers at an 8 percent hit rate warrants some circumspection:
We all want to catch the bad guys. This is a tool, and I want to make sure this tool works properly. Eight out of 100, 8 percent? I don't know if I'd want to be a paratrooper with that type of percentage. A lot of questions need to be answered. How heavily are we going to rely on this? Who's writing up these algorithms? The person writing the algorithm has certain biases himself. These are things we have to look at.
Stanton's right. The software could be six times more accurate, and the results would still be no better than a coin toss: six times 8 percent is only 48 percent. Law enforcement budgets are far too tight to waste precious dollars on technology with such a questionable return on investment.