By Ed Chau and Debra Gore-Mann
CalMatters

You can’t see algorithms, but they can affect huge parts of your life, from seemingly minor things like which video YouTube queues up next to life-and-death issues such as whether you can get a COVID-19 vaccination. It’s time we all had a better idea of how algorithms impact us, particularly when the government is using them.

An algorithm is simply a set of rules and instructions used by a computer program to perform a task or solve a problem. While algorithms themselves are coldly mathematical, they are created by humans who, like all of us, can have blind spots, biases or preconceptions. And that can lead to algorithms that make bad decisions or even perpetuate racial and gender bias.
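To see how a “coldly mathematical” rule can still discriminate, consider a minimal, entirely hypothetical sketch in Python. The scoring function, the ZIP codes and every number below are invented for illustration; nothing here comes from the actual systems described in this piece.

```python
# Hypothetical illustration only: how a facially neutral scoring rule can
# reproduce historical discrimination. All names and numbers are invented.

# Imagine a benefits-eligibility algorithm that boosts scores for applicants
# from ZIP codes with historically high rates of program "success."
HISTORICAL_SUCCESS_RATE = {  # invented data, standing in for decades of records
    "94102": 0.82,  # neighborhood that historically received investment
    "94124": 0.41,  # formerly redlined neighborhood, under-invested for decades
}

def eligibility_score(income: float, zip_code: str) -> float:
    """Score an applicant. The rule never mentions race, yet the ZIP-code
    term quietly imports the legacy of redlining into every decision."""
    base = min(income / 50_000, 1.0)            # simple income factor
    neighborhood = HISTORICAL_SUCCESS_RATE.get(zip_code, 0.5)
    return 0.5 * base + 0.5 * neighborhood      # blend: half income, half history

# Two applicants with identical incomes get very different scores,
# purely because of where past discrimination left them living.
print(eligibility_score(40_000, "94102"))  # 0.81
print(eligibility_score(40_000, "94124"))  # 0.605
```

The rule never references race, but in this sketch the ZIP code acts as a proxy for it, which is exactly how a blind spot in an algorithm’s design can turn into automated discrimination.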

These algorithms increasingly feed into artificial intelligence systems in which machine learning makes decisions and predictions from data about people – decisions previously made by humans. According to PwC research, artificial intelligence could contribute $15.7 trillion to the global economy by 2030.

The Greenlining Institute recently released an analysis of the problem, titled Algorithmic Bias Explained: How Automated Decision-Making Becomes Automated Discrimination, which included some startling findings. The report reviews a number of incidents that have made it into the media in which algorithms perpetuated discrimination based on race, gender or income – and those reports represent just the tip of the iceberg, because most algorithms operate in the background, unseen and unknown by those whose lives they impact.

Some of the most disturbing reports have involved government programs, including an Arkansas Medicaid algorithm that wrongly cut off medical and nursing home benefits to hundreds of people. Another, used in Detroit, perpetuated old, discriminatory patterns of redlining by channeling community development funding away from the very neighborhoods that needed it most – literally a case of algorithmic redlining.

In February, the New York Times reported serious issues with an algorithm the federal government uses to manage COVID-19 vaccine allocations: “The Tiberius algorithm calculates state vaccine allotments based on data from the American Community Survey, a household poll from the United States Census Bureau that may undercount certain populations – like undocumented immigrants or tribal communities – at risk for the virus.”

Equally concerning, the New York Times quoted researchers and health officials frustrated by how little they know about how the Tiberius algorithm decides how many vaccine doses to send where, describing it as “a black box.”

When government makes decisions that affect our daily lives, our communities and potentially even our very survival, those decisions should not be made in a black box.
