On Wednesday, we inaugurated Justice Talk, our new monthly online discussion series in partnership with Digg, with a lively debate about the ethics of predictive policing. The conversation covered everything from the intricacies of the math to legal liability to how the software could help prevent domestic violence. One recurring theme: how a supposedly neutral technology can reflect or reinforce existing biases, especially when it comes to race.
We’ve pulled out some of the best comments below (edited for length and clarity).
On Racial and Socio-Economic Profiling
Kade Crockford, ACLU of Massachusetts: Predictive policing is built off of historical crime data. And who have police targeted for enforcement on issues like drugs? Inputting that historically biased data into a computer, crunching it through an algorithm, and expecting anything other than instructions to go police more poor, Black, and brown people is insanity.
Jeffrey P. Rush: Have to quote Baretta here, "Don't do the crime if you can't do the time." Regardless of race, cops generally arrest those who break the law. We never seem to want to have the conversation about that. Only, it seems, what cops do.
Kade: That's a nice idea, but it's not how policing or punishment works in our society. The drug war is the best example, but there are others. If you're Black or brown, you're much more likely to be arrested for drug use or sales than if you're white.
Michael Schwartz: Jeffrey, I would counter that cops generally arrest only some of those who break the law. When a police force concentrates its presence in specific neighborhoods, over time it will appear that folks in those neighborhoods commit crimes at a higher rate than folks in areas with a smaller police presence.
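Michael's point can be made concrete with a toy simulation. The sketch below is purely illustrative, with invented numbers rather than any vendor's actual model: two neighborhoods share the same underlying offense rate, but one starts with more recorded arrests, and patrols are allocated in proportion to the arrest record.

```python
import random

# Toy feedback-loop simulation with invented numbers (not any real model):
# neighborhoods "A" and "B" have the same true offense rate, but "A" starts
# with more recorded arrests because it was historically over-policed.
TRUE_OFFENSE_RATE = 0.05        # identical in both neighborhoods
arrests = {"A": 100, "B": 50}   # biased historical record

for year in range(10):
    total = sum(arrests.values())
    # Allocate 1,000 patrols per year in proportion to recorded arrests.
    patrols = {n: round(1000 * c / total) for n, c in arrests.items()}
    for n, p in patrols.items():
        # Each patrol observes an offense with the same probability in both
        # places, so more patrols simply means more recorded "crime".
        arrests[n] += sum(random.random() < TRUE_OFFENSE_RATE for _ in range(p))

print(arrests)  # A's recorded total stays roughly double B's: the initial
                # disparity is reproduced, not corrected, by the data.
```

The recorded statistics end up ratifying wherever enforcement started, which is the dynamic Kade and Michael are describing.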
Jeremy Heffner, Product Manager at Hunchlab: I also think that if we don't want the police to enforce a law, then perhaps we should remove the law. The war on drugs is a key example of a policy that disproportionately affects minorities.
Kade: People of color, particularly black people, are overrepresented at every stage of the criminal punishment system, from stop and frisk, to arrest, to charging, to sentencing, to harsh treatment once incarcerated, and to parole and probation conditions upon release. I found this excerpt from Johann Hari's drug war book particularly insightful on this point.
M Hanora: It's incredibly frustrating, in a national movement moment when we're actively confronting police brutality and state violence against people of color, that we're simultaneously hearing messages about "community policing" and "predictive policing," neither of which puts any accountability on police, and both of which channel resources and trust to the police while expanding their scope of power.
Maurice Chammah, Staff Writer at The Marshall Project: I think the idea here is to ask about how cops (and the governmental authorities that oversee them) allocate their resources. After all, we all fund the police through our tax dollars. It's not about trying to be overly friendly to criminals or critical of cops; it's about trying to figure out what strategies are going to create the healthiest, safest communities.
On Liability
Andrew G. Ferguson: How are police officers on the street instructed to use the HunchLab technology? Are they told to consider it a tip, a neighborhood profile, or just one of the many factors they can use for reasonable suspicion or probable cause? If they stop someone in one of the predicted areas, how are they told to use the information provided?
Maurice: I found, in St. Louis, that police officers were simply told to consider the areas predicted by HunchLab as "hot spots" — areas that they should visit periodically — but which should not affect reasonable suspicion or probable cause.
Andrew: From another reporter who was embedded with police officers trying out a non-HunchLab technology, I heard the story that the officers were instructed not to mention the technology in their reports, because they did not want the subject to be litigated in any future Fourth Amendment suppression hearing. I am curious to know if anyone has ever litigated the issue.
Maurice: While reporting, I looked in the standard legal databases and couldn't find any examples of litigation yet. But my sense from St. Louis Sgt. Colby Dolly is that he's aware this litigation is coming and has told his analysts to "be ready to defend this in court."
Andrew: What weight do you think courts (judges/prosecutors) should give to the fact that a particular area was predicted to be the location of a particular type of crime? If the question came up in a suppression hearing, how should a judge evaluate the suspicion generated by your algorithm?
Maurice: This has haunted other "risk assessment" tools in the criminal justice marketplace.
Andrew: For legal nerds it becomes very interesting, because the Supreme Court has blessed the idea of a "high crime area" as being one of the factors that can justify a stop of a suspect. What predictive policing is doing is creating "micro-high crime areas" which could have constitutional significance.
On Threat Scores
Prasanna Srinivasan: Once predictive policing entails tracking individuals and profiles, how is it different from the "police state" that "the other guys" did during the Cold War? And how does one reconcile this with a free society? I think it's a very thin line.
Kade: In my view, you cannot square these kinds of "threat score" models with the values of a free society. They are fundamentally incompatible. The stakes are even higher in the overseas war context, where the CIA is reportedly using a kind of predictive analysis to determine whom to execute with drone strikes.
Mark Hansen, Director of Brown Institute for Media Innovation: The data going into most of the models (to date, anyway) are not profiles of individuals. They range in content from just historical crime statistics (what crime, where, and when) to some extra information about the "terrain" of the city.
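In schematic terms, inputs like the ones Mark describes might look something like the sketch below. The field names are invented for illustration and are not any vendor's actual schema; the point is what's absent as much as what's present.

```python
from dataclasses import dataclass
from datetime import datetime

# A sketch of the kinds of inputs Mark describes. Field names are invented
# for illustration; this is not HunchLab's (or anyone's) actual schema.
@dataclass
class CrimeEvent:
    offense: str           # what: e.g., "burglary"
    lat: float             # where
    lon: float
    occurred_at: datetime  # when

@dataclass
class TerrainFeatures:
    # Extra context about a grid cell of the city, independent of any person.
    dist_to_bar_m: float      # e.g., distance to the nearest bar
    dist_to_transit_m: float  # or to a transit stop
    # Notably absent: names, addresses, or any individual profile.
```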
Maurice: It's also worth bringing up here that there are a lot of cases where technology that predicts certain people or households to be at a higher risk of violence has been used to go to people and say, "here are resources to turn your life in a different direction." One pastor in St. Louis told me he'd like to see this sort of thing. Here's a long, very well-researched story on Operation Ceasefire, one program that could be tied in with algorithmic prediction.
Andrew: The subject-based (meaning person-based) predictive policing strategies all involve police plus social services. In Chicago and Boston and other places, police have partnered with social services organizations to offer an alternative path. Now, I (perhaps naively) have wondered why one couldn't just use predictive technologies to isolate those most likely to commit crimes and just send in the pastors/social services as opposed to the police. I think I know the answer, but I can put it out as a question.
Eugene Lawson: I do not think the average citizen should ever have a risk level assigned to them. But I think an officer should be able to know immediately, when talking to a person, whether that person has a history of violent crime or is currently on probation or parole.
On the Efficiency of Predicting Crime
Kit Kat: How much has crime-predicting software helped police? Has it been as useful as they expected?
Jeremy: Our answer has been that it depends on how the tool is used and what types of tactics are employed.
Kit Kat: Do you personally think the tool is being used in the "right" way?
Jeremy: One example you may find interesting is in Greensboro, N.C. In that case, some officers were using HunchLab missions and some were not. The officers using HunchLab didn't feel that anything had changed (and they said so), but crime on the days when they were using HunchLab was lower. More details in their own words.
Mark: That's a great question. It raises the issue of how you decide that an algorithm like this is doing "the right" thing. We've talked about social values, but there's also the question of whether it is actually reducing crime, compared to not using the tool. And there should be other "metrics," like whether the patrolling also satisfies or advances other social values.
Jeremy: Personally, I feel that the police departments we are working with are quite mindful of using such tools the right way. One example I will give is the bias present in drug crime data. Police departments know this bias exists and don't want us to model it.
Kit Kat: Do the police come to you when they feel like there could be improvements?
Jeremy: Quite often. We used to select the highest-risk areas in a strict sense. Officers gave us feedback that they wished their daily work were more varied, since areas with crime problems tend to persist across time. Up until that point we hadn't really considered the officer-psychology aspect of the application. Keeping officers mentally engaged will likely result in better police work, so we set out to figure out how to incorporate the feedback.
The result is that we no longer select focus areas that are strictly the highest risk. We probabilistically select areas in proportion to risk, as sketched below. This has a couple of advantages: officers have more varied work day to day and are kept more engaged, and the highest-risk areas aren't selected every day, every shift, which likely reduces the sense of "occupation" within those areas.
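In code, the change Jeremy describes might look roughly like the following sketch, with invented cell IDs and risk scores; it is not HunchLab's actual implementation.

```python
import random

# A minimal sketch of switching from "always patrol the top-k riskiest
# areas" to sampling k areas with probability proportional to risk.
# Cell IDs and scores are invented for illustration.
risk = {"cell_17": 0.90, "cell_04": 0.85, "cell_31": 0.40,
        "cell_22": 0.30, "cell_09": 0.10}

def top_k(risk, k):
    # Old behavior: the same high-risk cells get selected every shift.
    return sorted(risk, key=risk.get, reverse=True)[:k]

def proportional_sample(risk, k):
    # New behavior: high-risk cells are still favored, but lower-risk cells
    # are sometimes chosen, so the patrol mix varies from shift to shift.
    pool = dict(risk)
    chosen = []
    for _ in range(k):
        cells, weights = zip(*pool.items())
        pick = random.choices(cells, weights=weights, k=1)[0]
        chosen.append(pick)
        del pool[pick]  # without replacement: no cell picked twice per shift
    return chosen

print(top_k(risk, 2))                # always ['cell_17', 'cell_04']
print(proportional_sample(risk, 2))  # varies run to run, weighted by risk
```

Sampling without replacement in proportion to risk is one simple way to keep high-risk areas prominent while avoiding the fixed, every-shift presence Jeremy mentions.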
Maurice: I have seen that PredPol (another company, based in California) makes much stronger claims than HunchLab does about the drop in crime rates due to patrolling with its product. And this has bothered a lot of civil liberties advocates and community members, since it's so focused on patrolling rather than on other strategies of the sort Kade and Jeremy have discussed, such as community engagement.
Jenna Garcia: Perhaps we as researchers need to go to the police. I know there are obvious logistical and safety issues with that kind of research, but perhaps the most authentic way to understand police reform is to become embedded with police and bring a law-enforcement/research perspective to these issues. I don't think the obligation lies solely with police to reach out to the community; I have been saying for years that we need to be embedded with them.
Maurice: Absolutely. As a reporter, I kind of felt like I was representing the community (not the black community in St. Louis, for sure, but the nationwide community of people who aren't cops and are trying to understand what policing should entail). And I found a lot of police departments were uninterested in letting me in to learn about their use of predictive policing. St. Louis was the exception.