City Is First in Nation to Ban Police Predictive Technology

After fostering the development of predictive policing technology a decade ago, Santa Cruz has become the first city in the U.S. to approve a ban on its use.

The closely watched ordinance, unanimously approved this week by the City Council, bars Santa Cruz police from using both predictive policing and facial recognition technologies. San Francisco, Oakland and several other cities had previously banned police use of facial recognition technology, but civil liberties advocates said the predictive policing ban represents the first of its kind nationwide.

Santa Cruz’s ban prohibits police use of both technologies except with explicit approval from the City Council via a resolution based on “findings that the technology and the data that informs the technology meets scientifically validated and peer reviewed research, protects and safeguards the civil rights and liberties of all people, and will not perpetuate bias.”

The city’s move was backed by a coalition of dozens of civil liberty and racial justice groups, including the ACLU of Northern California, the Electronic Frontier Foundation and the Santa Cruz chapter of the NAACP.

It was also backed by Santa Cruz Police Chief Andy Mills, who has said his department ceased use of predictive policing in 2017.

“Predictive policing has been shown over time to put officers in conflict with communities rather than working with communities,” Mills said.

Predictive policing is a relatively new technology that takes in crime data and runs it through an algorithm to predict where crime is most likely to occur in the future. First used by Los Angeles police, the technology was further developed by Santa Cruz police and touted by local officials in the early 2010s.
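The article describes the approach only in broad strokes. As a rough illustration of the general idea (not PredPol's actual model), the toy Python sketch below scores map grid cells by recency-weighted counts of past incidents, so cells with many recent crimes rank as likely hot spots; the field names, grid size and decay constant are all assumptions made for the example.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

# Hypothetical incident record: a "what, where, when" row of the kind
# the article says these systems consume. Field names are illustrative.
@dataclass
class Incident:
    crime_type: str
    lat: float
    lon: float
    occurred_at: datetime

def hotspot_scores(incidents, cell_size=0.005, half_life_days=30.0, now=None):
    """Score map grid cells by recency-weighted counts of past incidents.

    A toy stand-in for the proprietary models the article describes:
    recent crimes near a cell raise that cell's score, and the
    highest-scoring cells would be flagged for extra patrol attention.
    """
    now = now or datetime.now()
    scores = Counter()
    for inc in incidents:
        # Bucket the incident into a lat/lon grid cell (roughly 500 m square).
        cell = (round(inc.lat / cell_size), round(inc.lon / cell_size))
        age_days = (now - inc.occurred_at).total_seconds() / 86400.0
        # Exponential decay: older incidents contribute less to the score.
        scores[cell] += 0.5 ** (age_days / half_life_days)
    return scores.most_common(10)  # ten highest-scoring "hot" cells
```

Production systems rely on far more elaborate statistical models than this, but the overall shape is the same: historical incident data in, a ranked list of places to patrol out.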

PredPol, a leading developer of the predictive software, was founded in Santa Cruz in 2012 with the involvement and support of some local officials.

Use of the predictive technology has in some cases correlated with, though not necessarily caused, reductions in reported crime in Santa Cruz and other cities. But critics say the technology entrenches what they call biased police practices that disproportionately impact minority residents, cloaking those biases under the cover of algorithmic rigor.

Facial recognition technology, meanwhile, has faced even more widespread criticism as evidence of algorithmic bias has emerged. A federal study released in December found that facial recognition algorithms were up to 100 times more likely to misidentify Black and Asian people, compared to white men.

Santa Cruz police have rolled out a number of departmental reforms, including banning chokeholds in advance of a statewide ban. Mills said his department is working on policy changes to bar “no-knock” search warrants.

The ban represents a striking reversal from a decade earlier, when police and city officials touted results from an experiment with predictive policing that led to the founding of PredPol. Two Santa Cruz County supervisors, Zach Friend and Ryan Coonerty, were involved in getting the company off the ground.

Coonerty, formerly a Santa Cruz mayor, was an early backer who reportedly helped raise more than $1 million in venture funding for PredPol before working as its director of government relations and strategy. Friend worked to test the underlying technology as a crime analyst for the Santa Cruz police department and served on the company’s board.

PredPol supports the city’s ordinance, according to a letter to the council from CEO Brian MacDonald.

“Given the racial inequalities pervasive throughout American history and society, we as a company support this language,” MacDonald wrote. “In fact, we would even go so far as to recommend that this standard be applied to all technologies adopted by the city of Santa Cruz, whether used for law enforcement purposes or not.”

PredPol uses three data points to feed its algorithm, according to the letter: crime type, location, and date and time. It does not use data from arrests or traffic stops, and it does not make predictions for crimes that “have the possibility of officer-initiated bias” such as drug crimes and prostitution, according to the letter.
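As a minimal sketch of the intake restriction the letter describes (only those three fields, with certain offense categories excluded), consider the Python snippet below; the field and category names are assumptions for illustration, not PredPol's actual schema.

```python
from typing import Optional

# Offense categories the letter gives as examples of crimes excluded
# because they "have the possibility of officer-initiated bias".
EXCLUDED_TYPES = {"drug offense", "prostitution"}

# The only fields the letter says feed the model: what, where, when.
ALLOWED_FIELDS = ("crime_type", "location", "occurred_at")

def to_model_record(report: dict) -> Optional[dict]:
    """Reduce a raw incident report to the three permitted fields.

    Returns None for excluded offense categories; everything else in the
    report (arrest details, traffic-stop data, personal identifiers) is
    simply never passed through to the model.
    """
    if report.get("crime_type", "").lower() in EXCLUDED_TYPES:
        return None
    return {field: report.get(field) for field in ALLOWED_FIELDS}
```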

©2020 the Santa Cruz Sentinel (Scotts Valley). Distributed by Tribune Content Agency, LLC.