Part of the Law Professor Blogs Network

Can and will “big data” enable astute (and/or scary) recidivism risk assessment?

The question in the title of this post is prompted by this interesting Christian Science Monitor article, headlined “Microsoft says its software can tell if you’re going back to prison.” Here are excerpts:

In a scenario that seems ripped straight from science fiction, Microsoft says its machine learning software can help law enforcement agencies predict whether an inmate is likely to commit another crime by analyzing his or her prison record.

In a series of videos and events at policing conferences, such as one on Oct. 6 at the Massachusetts Institute of Technology, Microsoft has been quietly marketing its software and cloud computing storage to law enforcement agencies.

It says the software could have several uses, such as allowing departments across the country to analyze social media postings and map them in order to develop a profile for a crime suspect. The push also includes partnerships with law enforcement technology companies, including Taser – the stun gun maker – to provide police with cloud storage for body camera footage that is compliant with federal standards for law enforcement data.

But in a more visionary – or possibly dystopian – approach, the company is also expanding into a growing market for what is often called predictive policing, using data to pinpoint people most likely to be at risk of being involved in future crimes.

These techniques aren’t really new. A predictive approach — preventing crime by understanding who is involved and recognizing patterns in how crimes are committed — builds on efforts dating back to the early 1990s, when the New York City police began using maps and statistics to track areas where crimes occurred most frequently.

“Predictive policing, I think, is kind of a catch-all for using data analysis to estimate what will happen in the future with regard to crime and policing,” says Carter Price, a mathematician at the RAND Corporation in Washington who has studied the technology. “There are some people who think it’s like the movie ‘Minority Report’ ” — in which an elite police unit can predict crimes and make arrests before they occur — “but it’s not. No amount of data is able to give us that type of detail.”

Scholars caution that while data analysis can provide patterns and details about some types of crimes – such as burglary or theft – when it comes to violent crime, such approaches can yield information for police about who is at high risk of violent victimization, not a list of potential offenders.

“Thinking that you do prediction around serious violent crime is empirically inaccurate, and leads to very serious justice issues. But saying, ‘This is a high risk place,’ lets you focus on offering social services,” says David Kennedy, a professor at John Jay College of Criminal Justice. In the 1990s, he pioneered an observation-driven approach that worked with local police in Boston to target violent crime. After identifying small groups of people in particular neighborhoods at high risk of either committing a crime or becoming a victim of violence, the program, Operation Ceasefire, engaged them in meetings with police and community members and presented them with a choice – either accept social services that were offered or face a harsh police response if they committed further crimes. It eventually resulted in a widespread drop in violent crime often referred to as the “Boston Miracle.”…

In one video tutorial for law enforcement agencies, Microsoft makes a sweeping claim. Using records pulled from a database of prison inmates and looking at factors such as whether an inmate is in a gang, his or her participation in prison rehabilitation programs, and how long such programs lasted, its software predicts whether an inmate is likely to commit another crime that ends in a prison sentence. Microsoft says its software is accurate 90 percent of the time.
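It is worth pausing on what “accurate 90 percent of the time” can mean for a prediction task like this. Accuracy alone is a weak measure when one outcome is much more common than the other, because a model that always predicts the majority outcome can score well without predicting anything. A tiny sketch, using entirely made-up numbers (nothing here reflects Microsoft’s actual data or methods):

```python
# Hypothetical cohort: 10,000 released inmates, of whom 1,000 (10%)
# return to prison. All numbers are invented for illustration only.
total = 10_000
reoffend = 1_000

# A "model" that simply predicts "will not return" for everyone is
# right on the 9,000 non-returners and wrong on the 1,000 returners.
correct = total - reoffend
accuracy = correct / total
print(f"accuracy of the do-nothing baseline: {accuracy:.0%}")  # prints 90%
```

So without knowing the base rate in Microsoft’s data, and how the software performs on the people who actually do return to prison, a headline accuracy figure tells us little.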

“The software is not explicitly programmed but patterned after nature,” Jeff King, Microsoft’s principal solutions architect, who focuses on products for state and local governments, says in the video. “The desired outcome is that we’re able to predict the future based on that past experience, and if we don’t have that past experience, we’re able to take those variables and then classify them based on dissimilar attributes.”…

While predictive policing is still in the early stages, some say the data it generates could have a mixed impact. While the information could improve police transparency, it could also lead to other problems. “If police departments had access to social media accounts, and it turned out that crimes were being committed by people who liked a certain kind of music and a certain sports team, it could lead to certain kinds of racial discrepancies,” says Dr. Price, the RAND researcher. “It’s a useful tool, but it should always be done with [the idea of] keeping in mind how this will impact populations differently, and just sort of being cognizant of that when policies are put in place.”

But Kennedy, the criminology professor, says that for violent crimes, using data that shows crime risks to influence policing actions could have devastating consequences. “People have been trying to predict violent crimes using risk factors for generations, and it’s never worked,” he says. “I think the inescapable truth is that, as good as the prediction about people may get, the false positives are going to swamp the actual positives … and if we’re taking criminal action on an overwhelming pool of false positives, we’re going to be doing real injustice and real harm to real people.”
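Kennedy’s point about false positives swamping actual positives is a consequence of base rates, and it can be made concrete with simple arithmetic. The numbers below are hypothetical (a 1% base rate and a classifier with 90% sensitivity and 90% specificity are my illustrative assumptions, not figures from the article):

```python
# Hypothetical screening of 100,000 people for "will commit a violent
# crime," where the true base rate is 1%. All numbers are invented.
population = 100_000
base_rate = 0.01
sensitivity = 0.90   # flags 90% of actual future offenders
specificity = 0.90   # correctly clears 90% of non-offenders

offenders = population * base_rate                   # 1,000 people
non_offenders = population - offenders               # 99,000 people

true_positives = offenders * sensitivity             # 900 flagged correctly
false_positives = non_offenders * (1 - specificity)  # 9,900 flagged wrongly

precision = true_positives / (true_positives + false_positives)
print(f"flagged people who are actual offenders: {precision:.1%}")  # ~8.3%
```

Even with a classifier that sounds impressive on paper, more than nine out of every ten people flagged in this hypothetical would be false positives, which is exactly the injustice Kennedy is warning about if “criminal action” is taken on the flagged pool.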