Like SF, Oakland Mulls Facial Recognition Software Ban

An Oakland commission voted recently to support a proposal that would ban the use of facial recognition technology by city departments, following in the footsteps of San Francisco, which is considering a similar ban.

Oakland is considering banning the use of facial recognition software by city departments, including police, a move that could make it one of the first cities in the country to issue a prohibition on the technology.

Restricting use of the software is necessary, proponents say, because it raises human rights and privacy concerns, particularly for people of color.

The San Francisco Board of Supervisors is scheduled to vote Tuesday on a similar policy to ban the use of facial recognition technology and to require board approval for any departments purchasing new surveillance devices.

Oakland’s Privacy Advisory Commission voted unanimously last week to support a proposal that would ban the use of the software. Introduced by Chairman Brian Hofer and Commissioner Heather Patterson, the proposal would amend the city’s current surveillance ordinance and prevent city departments from adopting any facial recognition technology and from using information obtained by the software.

“We have a right to be anonymous in public, to move about and freely associate,” Hofer said. “With facial recognition technology, those rights would be infringed. In the private sector, facial recognition software is being used very fast. We wanted to stop it before it gets out in the wild.”

The proposal is under review by the city attorney and administrator. It will be heard later this month at the Public Safety Committee. If the proposal is passed by the committee, it will be presented to the City Council as early as June.

Johnna Watson, a spokeswoman for Oakland police, said the department does not use facial recognition software.

Oakland currently has one of the strongest surveillance ordinances in the country. Passed in 2018, it requires that any proposal involving the use of surveillance be heard in a public discussion. City departments are required to produce annual reports on their surveillance technology.

Sameena Usman, government relations coordinator at the Council on American-Islamic Relations in the San Francisco Bay Area, said the Muslim American community nationwide has been subjected to surveillance by law enforcement.

A lawsuit filed by the ACLU in 2010 forced the FBI to turn over more than 50,000 pages of documents that showed the federal agency’s San Francisco agents had taken notes on the religious activities of Muslims from 2004 to 2010.

Eliminating the possibility that facial recognition software could be used is important, Usman said, to mend a fragmented relationship between law enforcement and the Muslim community.

“Communities of color, including the American Muslim community, have been concerned over surveillance and being able to practice our First Amendment right to practice our religion openly without having to be concerned of surveillance,” she said. “We’ve seen surveillance in the past of our communities and while we have a robust surveillance ordinance (in Oakland), we want to ensure that facial recognition technology is not included among the technologies used by Oakland police.”

Council Member Noel Gallo said he would support the use of facial recognition software because Oakland police could use all the help they can get — though he would not comment on the current proposal.

“In Oakland, we do have a good amount of crime,” Gallo said. “It’s just a form of public safety since I don’t have the police staffing necessary to protect our children and families. If you do the crime, you should also be clearly identified.”

But Council President Rebecca Kaplan said banning the technology sends an important message to residents that the city is working to prevent the use of any systems that could “unjustly” harm people. She said she would support legislation to ban facial recognition software because of its impact on communities of color.

“Facial recognition systems have been abused and also disproportionately incorrectly identify people of color, putting people at risk of false arrest,” she said.

A study released in January 2018 by the M.I.T. Media Lab found that facial recognition software misidentified darker-skinned women as much as 35 percent of the time. In 2016, Georgetown University researchers estimated that 117 million Americans were in law enforcement facial recognition databases, and their study found that police facial recognition technology disproportionately affects African Americans.

“These systems will not make us safer,” said Matt Cagle, a technology and civil liberties attorney with the American Civil Liberties Union of Northern California. “By preventing the spread of facial surveillance technology, Oakland gives itself the opportunity to focus on real public safety solutions, like listening to the community, following their surveillance ordinance that requires a community conversation about public safety issues. This presents a real opportunity to work on those relationships.”

If the ban is passed by the City Council, the city’s current surveillance ordinance would be amended to include the ban.

“The concern (with facial recognition) is so great that we are saying, ‘Let’s not do this. Let’s not even go down this road,’” Hofer said.

©2019 the San Francisco Chronicle. Distributed by Tribune Content Agency, LLC.