Public sector AI can transform government, but it also risks compounding biases against vulnerable groups and deepening existing inequities. AB 13 can prevent this.

The public sector increasingly uses automated systems to make decisions and improve efficiency. However, poorly designed automated decision systems (ADS) can produce unfair, biased, and inaccurate results, causing disproportionate harm to low-income families and communities of color while also undermining their trust in the public sector.

AB 13 will help ensure that government algorithms making "high-risk" decisions are fair, free from unfair bias, and actually work as advertised. It does this by encouraging developers and agencies to complete an impact assessment before a high-risk automated decision system is purchased and deployed. These impact assessments will encourage state agencies to purchase algorithms that have been tested for bias and that can provide clear explanations for an automated decision. As government use of AI systems continues to grow, building public trust in these systems is essential, making it increasingly important to ensure that they are fair and work as intended. AB 13 will also require California to create an inventory of the high-risk automated decision systems already in use by state agencies.

Frequently Asked Questions