Public sector AI can transform government, but it also risks compounding biases against vulnerable groups and deepening existing inequities. AB 13 can prevent this.
The public sector increasingly uses automated systems to make decisions and improve efficiency. However, poorly designed automated decision systems (ADS) can produce unfair, biased, and inaccurate results, causing disproportionate harm to low-income families and communities of color while undermining their trust in the public sector:
- Unemployment Benefits: Michigan purchased a $47 million unemployment algorithm that wrongly accused over 40,000 residents of fraud, causing many to go bankrupt and lose their homes and jobs. The state had to repay millions and faces multiple lawsuits.
- Public Health Care: When Arkansas implemented a Medicaid access algorithm, thousands of people saw their benefits cut, losing access to home care, nursing visits, and medical treatments due to miscoding and incorrect calculations. The state was sued in part because there was no effective way to appeal these algorithmic decisions.
- Criminal Justice: A criminal risk assessment algorithm used across the country was found to falsely label African-American and Hispanic defendants as high-risk far more often than white defendants. The system was also found to produce unfair sentencing outcomes for women. And the algorithm was only 65% accurate in its predictions, while a group of randomly selected people with no criminal justice expertise was 67% accurate.
AB 13 will help ensure that government algorithms making "high-risk" decisions are fair, free from bias, and actually work as advertised. It does this by encouraging developers and agencies to complete an impact assessment before purchasing and deploying a high-risk automated decision system. These impact assessments will encourage state agencies to purchase algorithms that have been tested for bias and can provide clear explanations for their automated decisions. As government use of AI continues to grow, building public trust in these systems is essential, making it increasingly important to ensure they are fair and work as intended. AB 13 will also require California to create an inventory of the high-risk automated decision systems already in use by state agencies.