By Jaivon Grant
More than 100 national and local civil rights groups favoring an end to the money bail system have expressed serious concerns about the adoption of algorithm-based decision-making tools as a substitute for money bail.
Groups including the American Civil Liberties Union, the NAACP Legal Defense and Educational Fund, MoveOn, The Greenlining Institute, Media Mobilizing Project, and Color of Change wrote that algorithmic risk assessment tools are not the solution to reforming bail systems, “and that, in fact, these tools can worsen racial disparities and allow further incarceration.”
In recent years, algorithm-based tools have raised serious concerns about privacy, racial prejudice, and potential violations of a person’s constitutional rights.
The ACLU of Northern California (ACLU-NC) recently obtained documents about Amazon’s Rekognition program, a new real-time facial recognition service used on public streets for surveillance and policing. After reviewing the ACLU-NC documents, Russell Brandom of The Verge found that while police typically use facial recognition to look for subjects, “white subjects are consistently less likely to generate false matches than black subjects, a bias that’s been found across a number of algorithms. The bias seems to come from the data used to train the algorithm, which often skews white and male.”
The same is true of the Transportation Security Administration’s algorithm-based program called “Quiet Skies.” Quiet Skies specifically targets travelers who “are not under investigation by any agency and are not in the Terrorist Screening Data Base,” but whose “travel patterns or behaviors match those of known or suspected terrorists, or people ‘possibly affiliated’ with someone on a watch list.”
According to a recent article in the Boston Globe, some of the suspicious behaviors listed in Quiet Skies include:
- Reversing or changing directions and/or stopping while in transit through the airport;
- Observing the boarding gate area from afar;
- Boarding last;
- Rapid eye blinking;
- Having an “Adam’s apple jump”;
- “White knuckling” bags;
- Having an appearance different from the ID provided;
- Sleeping during most of the flight; and
- Sleeping only briefly.
Hugh Handeyside, senior staff attorney with the American Civil Liberties Union’s National Security Project, stated: “These revelations raise profound concerns about whether TSA is conducting pervasive surveillance of travelers without any suspicion of actual wrongdoing. If TSA is using proxies for race or religion to single out travelers for surveillance, that could violate the travelers’ constitutional rights.”
Algorithm-based decision-making tools are also being promoted as a way to forecast an individual’s likelihood of appearing at future court dates and/or risk of re-arrest. The coalition of civil rights organizations has expressed concern about these tools’ lack of transparency and says they will worsen racial disparities and escalate injustices for people of color.
This is why California lawmakers will face a very important decision this August as they debate and vote on Senate Bill 10, legislation by State Senator Bob Hertzberg (D-Van Nuys) and Assemblyman Rob Bonta (D-Oakland) that seeks to end California’s money bail system and replace it with an algorithm-based tool.
The local and national civil rights groups have long sought to end the money bail system, and SB 10 is the closest they have come to doing so to date. But Sen. Hertzberg and Asm. Bonta may have a very difficult time persuading enough of their Democratic colleagues to support an algorithm-based bail program in the face of such compelling opposition from a powerful coalition of civil rights organizations.