Lawmakers Want Pause on Federal Funds for Predictive Policing


Should data scientists be in the business of fingering Americans for crimes they could commit, someday? Last month, a group of federal lawmakers asked the Department of Justice to stop funding such programs—at least until safeguards can be built in. It’s just the latest battle over a controversial field of law enforcement that seeks to peer into the future to fight crime.

“We write to urge you to halt all Department of Justice (DOJ) grants for predictive policing systems until the DOJ can ensure that grant recipients will not use such systems in ways that have a discriminatory impact,” reads a January letter to Attorney General Merrick Garland from U.S. Sen. Ron Wyden (D–Ore.) and Rep. Yvette Clarke (D–N.Y.), joined by Senators Jeff Merkley (D–Ore.), Alex Padilla (D–Calif.), Peter Welch (D–Vt.), John Fetterman (D–Pa.), and Ed Markey (D–Mass.). “Mounting evidence indicates that predictive policing technologies do not reduce crime. Instead, they worsen the unequal treatment of Americans of color by law enforcement.”

The letter emphasizes worries about racial discrimination, but it also raises concerns about accuracy and civil liberties that, since day one, have dogged schemes to address crimes that haven’t yet occurred.

The Rattler is a weekly newsletter from J.D. Tuccille. If you care about government overreach and tangible threats to everyday liberty, this is for you.

Fingering Criminals-To-Be

Criminal justice theorists have long dreamed of stopping crimes before they happen. Crimes prevented mean no victims, costs, or perpetrators to punish. That’s led to proposals for welfare and education programs intended to deter kids from becoming predators. It’s also inspired “predictive policing” efforts that assume crunching numbers can tell you who is prone to prey on others. It’s an intriguing idea, if you ignore the dangers of targeting people for what they might do in the future.

“For years, businesses have used data analysis to anticipate market conditions or industry trends and drive sales strategies,” Beth Pearsall wrote in the Department of Justice’s NIJ Journal in 2010. “Police can use a similar data analysis to help make their work more efficient. The idea is being called ‘predictive policing,’ and some in the field believe it has the potential to transform law enforcement by enabling police to anticipate and prevent crime instead of simply responding to it.”

Interesting. But marketers targeting neighborhoods for home warranty pitches only annoy people when they’re wrong; policing efforts have much higher stakes when they’re flawed or malicious.

“The accuracy of predictive policing programs depends on the accuracy of the information they are fed,” Reason’s Ronald Bailey noted in 2012. “We should always keep in mind that any new technology that helps the police to better protect citizens can also be used to better oppress them.”

Predictive Policing in (Bad) Action

People worried about the dangers of predictive policing often reference the 2002 movie Minority Report, in which a science-fiction take on the practice is abused to implicate innocent people. Recent years, though, have delivered real-life cautionary tales about misusing data science to torment people for crimes they haven’t committed.

“First the Sheriff’s Office generates lists of people it considers likely to break the law, based on arrest histories, unspecified intelligence and arbitrary decisions by police analysts,” the Tampa Bay Times reported in 2020 of Pasco County, Florida’s predictive policing program. “Then it sends deputies to find and interrogate anyone whose name appears, often without probable cause, a search warrant or evidence of a specific crime.”

In practice, as a former deputy described the program’s treatment of those it targeted: “Make their lives miserable until they move or sue.”

Sue they did, with many plaintiffs represented by the Institute for Justice. Last year, with legal costs mounting, the sheriff’s office claimed in court documents that it discontinued predictive policing efforts.

Garbage In, Garbage Out

A big problem with predictive policing is that it relies heavily on honesty and dispassion in people who create algorithms and enter data. As recent arguments over biases in internet search results and artificial intelligence reveal, the results that come out of a data-driven system are only as good as what goes in.

“One foundational problem with data-driven policing is that it treats information as neutral, ignoring how it can reflect over-policing and historical redlining,” the Brennan Center for Justice’s Ángel Díaz wrote in 2021. He added that tech vendors dealing with the NYPD’s predictive policing program “proposed relying on data such as educational attainment, the availability of public transportation, and the number of health facilities and liquor licenses in a given neighborhood to predict areas of the city where crime was likely to occur.”

Are those real predictors of criminal activity? Maybe. Or maybe they’re excuses for making people’s lives miserable until they move or sue, as happened in Pasco County.
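The amplification critics describe can be illustrated with a toy simulation. This is a hypothetical sketch, not any vendor's actual system: it assumes patrols are allocated disproportionately toward neighborhoods with more *recorded* crime, and that more patrol presence yields more recorded incidents. Even when two neighborhoods have identical true crime rates, an initial disparity in the arrest data compounds on its own.

```python
def simulate(recorded, true_rate=10.0, detection=0.5, rounds=5, patrols=10):
    """Toy feedback loop: patrols concentrate on 'hotspots' (quadratic
    weighting of recorded crime), and new recorded incidents scale with
    patrol presence rather than with the (equal) true crime rate."""
    history = [tuple(recorded)]
    for _ in range(rounds):
        # Hotspot allocation: squaring concentrates patrols where the
        # books already show more crime. This is the key assumption.
        total_sq = sum(r * r for r in recorded)
        patrol_share = [r * r / total_sq for r in recorded]
        # Recorded crime grows with patrol presence, not actual crime.
        recorded = [r + true_rate * detection * p * patrols
                    for r, p in zip(recorded, patrol_share)]
        history.append(tuple(recorded))
    return history

# Both neighborhoods have the same true crime rate; neighborhood A
# merely starts with more arrests on the books.
runs = simulate(recorded=[60.0, 40.0])
first, last = runs[0], runs[-1]
print(f"A's share of recorded crime: "
      f"{first[0] / sum(first):.2f} -> {last[0] / sum(last):.2f}")
```

Under these assumptions, A's share of recorded crime grows every round even though nothing about the underlying neighborhoods differs. That is the "garbage in, garbage out" worry in miniature: the data validates the patrol pattern that produced it.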

Forecasts Fueled by the Feds

As with so many big ideas with scary potential, the impetus for development and implementation comes from government funding and encouragement.

“The National Institute of Justice, the DOJ’s research, development and evaluation arm, regularly provides seed money for grants and pilot projects to test out ideas like predictive policing,” American University law professor Andrew Guthrie Ferguson commented earlier this month. “It was a National Institute of Justice grant that funded the first predictive policing conference in 2009 that launched the idea that past crime data could be run through an algorithm to predict future criminal risk.”

Of course, it’s not bad to seek innovation and to look for new tools that could make the public safer. But hopefully, those funding such research want it to make the world a better place, not a worse one. And when lawmakers asked the Justice Department in 2022 for documentation on predictive policing, officials admitted they didn’t really know how the money was being spent, let alone what its impact was.

“It remains an unanswered [question], for example, to what degree such tools are, or ever have been, assessed for compliance with civil rights law,” Gizmodo’s Dell Cameron wrote at the time.

Hence the letter from Wyden and company. After years of haphazard funding and development, warnings from civil libertarians, and abuses by police, some lawmakers want the federal government to stop funding predictive policing efforts until due diligence is done and safeguards are in place.

You have to wonder if predictive policing programs predicted the field’s own current troubles.
