AB 13 seeks to prevent algorithm-driven systems from resulting in discrimination
SACRAMENTO – Today, the Senate Committee on Judiciary approved Assembly Bill (AB) 13, authored by Assemblymember Ed Chau (D–Monterey Park). This bill would bring accountability and transparency to algorithm-driven systems used by public entities, which rely on machine learning or artificial intelligence to make decisions affecting people’s lives.
Specifically, AB 13, titled the Automated Decision System Accountability Act, would create the first statewide algorithmic accountability framework, setting forth criteria for the procurement of high-risk automated decision systems by government entities in order to minimize the risk of adverse and discriminatory impacts resulting from their design and application.
“Poorly designed algorithm-driven systems can create unfair, biased and inaccurate results, causing disproportionate harm to low-income families and communities of color while also undermining trust in the public sector,” said Assemblymember Chau. “Without clear oversight of these systems by the very government agencies that purchase and use them, we would fail in our responsibility to ensure these systems do not create new harms or result in discriminatory decisions that affect our legal rights.”
According to a 2019 report by The Brookings Institution’s Artificial Intelligence and Emerging Technology Initiative, “algorithmic or automated decision systems use data and statistical analyses to classify people and assess their eligibility for a benefit or penalty.” The application of these systems can assist with credit decisions, employment screening, insurance eligibility, as well as the delivery of government services, criminal justice sentencing, and probation decisions. And, according to a report entitled Algorithmic Bias Explained: How Automated Decision-Making Becomes Automated Discrimination, from The Greenlining Institute, algorithmic bias occurs when an algorithmic decision creates unfair outcomes that unjustifiably and arbitrarily privilege certain groups over others. This matters because algorithms act as gatekeepers to economic opportunity. The Greenlining Institute is the sponsor of the Automated Decision System Accountability Act of 2021.
When it comes to the acquisition of these systems by government agencies, we must examine the state’s procurement policies. A report from the AI Now Institute recommends that public agencies adopt impact assessments during the procurement process, to ensure that automated decision systems are more accurate and fair, and that potential concerns are addressed before a system goes live and begins to impact the public. The Automated Decision System Accountability Act increases agencies’ internal expertise and capacity to evaluate the systems they procure, helping them anticipate concerning issues such as disparate impacts or due process violations and avoid public backlash.
“We must ensure that our government and public agencies understand the risks and potential impacts when they purchase high-risk algorithms that control access to housing, credit, government services and economic opportunity. We’ve seen time and time again that when these systems fail, communities of color and low-income families bear the brunt of the harm,” said Greenlining Institute Technology Equity Legal Counsel Vinhcent Le. “The Greenlining Institute thanks the committee for its support of the Automated Decision System Accountability Act, which takes key first steps toward ensuring our government algorithms are fair and unbiased. This is critical if we are to close the racial wealth gap and rebuild eroding public trust in government and technology.”
Assemblymember Ed Chau represents the 49th Assembly District, which includes the communities of Alhambra, Arcadia, El Monte, Monterey Park, Rosemead, San Gabriel, San Marino, Temple City and portions of Montebello and South El Monte.