
By Jesse Marx
Voice of San Diego

In 2019, the California Legislature put a stop to the police use of facial recognition. Although law enforcement agencies view the ability to unmask people as a valuable investigative tool, the technology is imperfect. Research suggests that the algorithms are good at identifying White people but less effective when it comes to people of color. Indeed, there’ve been plenty of stories showing how Black men were falsely identified and accused of crimes.

But because the state’s ban was temporary — it began in 2020 and lasts three years — you’re likely to hear a lot more about biometric surveillance in the Capitol going forward. The debate over its usefulness and potential harms will only intensify because it draws on two competing values: privacy and public safety.

Earlier this year, the Greenlining Institute, a progressive advocacy group, released a report about how algorithms were replacing decision-making at all levels of society, not just policing but health care, housing, finance, education and more. The purpose of the report was to provide policymakers with a baseline understanding of how bias infiltrates even the most well-intentioned, seemingly neutral tools.

The group is advocating for policies — including AB 13, currently up for consideration — that seek to mitigate what’s known as algorithmic discrimination. Another California group is now suing the maker of a facial recognition app for allegedly stockpiling data on 3 billion people without their knowledge or permission. The company, which offered its services to some San Diego agencies, contends that its technology is not racially biased and will reduce rather than increase the likelihood of wrongful arrest.

Part of the issue is the quasi-religious faith that officials place in the digital authority of computer programming to see the things that we, as mere mortals, can’t see. In 2018, the Little Hoover Commission warned that California, though home to Silicon Valley, was failing to prepare for a future dominated by artificial intelligence, one that might, say, predict where a wildfire will occur.

Transparency in this space is increasingly important. The interest in facial recognition extends well beyond California and hits close to home.

As I reported earlier this week, the U.S. House of Representatives’ Committee on Oversight and Reform once reached out to San Diego with a request for documentation about the city’s use of facial recognition, but the picture it got was less than complete. The response was missing a number of key documents expressing internal concern over some of the same issues I described above, as early as 2011.