Have you ever felt like you missed the boat on a huge opportunity? I have. A friend once told me to buy bitcoin when it was $80; now it’s almost $5,000, and my stomach clenches thinking about what could have been. I get that same feeling when I think about big data and the opportunity it presents for communities of color. Big data is now the world’s most valuable resource; it’s even been called the “new oil.” Machine learning and predictive analytics are the oil rigs and refineries that extract and refine data into valuable business insights. Corporations race to tap into big data because it helps them innovate faster, sell more, track trends, and manipulate public opinion. What’s missing from this conversation is social justice: how big data can be used to both harm and help efforts to bridge America’s widening racial and economic divides.


Big Data is the New Oil


What is Big Data?

In the movie Minority Report, which takes place in 2054, the Washington, D.C. police department has a “PreCrime” unit of mutant psychics who can predict murders before they occur. That future is here, but instead of psychics we have machines analyzing mountains of data to find patterns and relationships the human eye can’t see, and the insights these programs generate are used to make predictions. Beyond predicting crime, we have algorithms crunching seemingly arbitrary information like credit card purchases, Facebook likes, Yelp reviews, browsing and search history, text spelling errors, location, and even phone battery life to predict things about you: your creditworthiness, the TV shows you like, your mental health, your insurance risk, your job performance, your political beliefs, and whether you’re poor, undocumented, renting or even pregnant. For better or worse, companies and governments use data analytics to answer pretty much any question people can come up with; one team is even using machine learning on Nazi war records to try to find out who betrayed Anne Frank. However, these algorithmic predictions can be wrong, and a wrong prediction can have disastrous consequences.
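
For readers who want to see the mechanics, here is a minimal, purely illustrative sketch in Python using scikit-learn. The “signals,” the labels and the numbers are all invented for this example; no real lender, insurer or platform works from a model this simple.

from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row is one person, described by
# seemingly arbitrary signals (late-night purchases per week, average
# review length, phone battery level at checkout).
X_train = [
    [5, 20, 0.15],
    [1, 80, 0.90],
    [4, 35, 0.20],
    [0, 60, 0.85],
]
# The label is whatever outcome the company cares about,
# e.g. 1 = "defaulted on a loan", 0 = "did not default".
y_train = [1, 0, 1, 0]

# The algorithm learns whatever pattern links the signals to the outcome...
model = LogisticRegression()
model.fit(X_train, y_train)

# ...then scores people it has never seen, based on the same signals.
new_person = [[3, 25, 0.30]]
print(model.predict(new_person))        # predicted label
print(model.predict_proba(new_person))  # predicted "risk" probability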

Data-Driven Predictions – Expanding Racial Inequity?

Minority Report’s Psychics Are Today’s Algorithms

The psychics in Minority Report wrongly predicted that the lead character, detective John Anderton, was a murderer, and even though he knew he was innocent, he was trapped and persecuted by a society that blindly believed those psychic predictions. The Greenlining Institute focuses on the use of big data because this scenario isn’t just fiction. Incorrect predictions can mean unfairly losing a job opportunity, receiving the wrong medical treatment, serving more prison time, paying more for insurance or losing access to credit. As big data use expands, so do the consequences of a wrong prediction. Big data has the potential to reinforce historical patterns of discrimination, but, perhaps more importantly, it can also be a powerful tool for increasing economic opportunity.

Algorithms and Data Are Biased

Algorithmic predictions can be off the mark for many reasons, starting with the fact that data and the people who program predictive algorithms can be biased. Bias enters algorithmic decision-making systems because, at the end of the day, the inputs to those systems come from people. Like our children, algorithms learn from us, and that means we can transmit our implicit or explicit biases to them. As is so often the case, this bias negatively affects people of color. For example, if a face recognition or beauty-judging algorithm is trained only on white faces, it will favor white women in beauty contests or mistakenly label Black faces as gorillas. Problems like these can be addressed through greater diversity and inclusion among the teams that design algorithms and feed them data. A harder problem arises when the data itself reflects systemic bias.
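
The “trained with only white faces” problem can be illustrated in the abstract with a small, hypothetical simulation. The code below uses synthetic numbers rather than real images, and the groups, features and thresholds are all made up, but it shows the underlying mechanism: when one group dominates the training data, the model’s accuracy for the under-represented group suffers.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(center, n):
    # Synthetic stand-in for one group's data: five numeric features
    # clustered around a group-specific center, with an arbitrary
    # ground-truth rule deciding the "correct" label.
    X = rng.normal(loc=center, scale=1.0, size=(n, 5))
    y = (X.sum(axis=1) > center * 5).astype(int)
    return X, y

# Group A dominates the training set; group B is barely represented.
X_a, y_a = make_group(center=0.0, n=950)
X_b, y_b = make_group(center=3.0, n=50)
model = LogisticRegression().fit(np.vstack([X_a, X_b]),
                                 np.concatenate([y_a, y_b]))

# Evaluating each group separately on fresh data exposes the imbalance:
# the model is far less accurate for the group it barely saw in training.
X_a_test, y_a_test = make_group(center=0.0, n=500)
X_b_test, y_b_test = make_group(center=3.0, n=500)
print("Group A accuracy:", model.score(X_a_test, y_a_test))
print("Group B accuracy:", model.score(X_b_test, y_b_test))

More representative data narrows that accuracy gap, which is exactly why diversity among the people who build these systems and curate their data matters.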

As in Minority Report, law enforcement agencies increasingly use predictive policing programs and tools that assign “risk scores” to aid in criminal sentencing, and these programs can perpetuate the systemic racial bias inherent in crime data and the justice system. For example, Black and White Americans use drugs at comparable rates, but the imprisonment rate of Black Americans for drug charges is almost six times that of Whites – not because of anything Black Americans do, but because of deep-seated biases in our criminal justice system.

A recent report found that systems using machine learning will “reproduce the inherent biases present in the data they are provided with” and assess ethnic and religious minorities, who are already disproportionately targeted, as posing an increased risk. “Acting on these predictions will then result in those individuals being disproportionately targeted by police action, creating a ‘feedback loop’ by which the predicted outcome simply becomes a self-fulfilling prophecy.”
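
That feedback loop can be sketched with a toy simulation. The numbers below are made up and real predictive policing systems are far more complicated, but the dynamic is the same: patrols follow the recorded data, new records follow the patrols, and the original skew never corrects itself.

# Two areas with identical true incident rates, but area B is already
# over-represented in the historical records that drive patrol allocation.
true_rate = {"A": 10, "B": 10}   # actual incidents per period (identical)
recorded = {"A": 20, "B": 60}    # historical records, skewed against B

for period in range(1, 6):
    # Predictive step: send patrols where the data says the crime is.
    share_b = recorded["B"] / (recorded["A"] + recorded["B"])
    # Recording step: incidents only enter the data where officers are
    # present, so each area's new records scale with its patrol share.
    recorded["A"] += true_rate["A"] * (1 - share_b)
    recorded["B"] += true_rate["B"] * share_b
    print(f"Period {period}: patrol share for B = {share_b:.0%}, "
          f"records A = {recorded['A']:.1f}, records B = {recorded['B']:.1f}")

# B's patrol share stays at 75% and the recorded gap keeps widening,
# even though both areas have exactly the same true rate.

That is the “self-fulfilling prophecy” in miniature: the prediction keeps generating the data that appears to confirm it.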

Biased Inputs = Biased Outputs


We must prevent such biased outcomes, not just in the criminal justice system but in every sector of our society. That’s why we at Greenlining, along with other nonprofits, tech companies, banks and regulators, are beginning to work toward the right ethical standards for using this data, protecting privacy and preventing bias. This is a necessary first step, but it’s only half of the big data picture.

Closing the Racial Wealth Gap with Big Data

At the end of Minority Report, the “PreCrime” unit is shut down because of the serious consequences of a wrong prediction. In the real world, however, the big data genie is out of the bottle and won’t be going away anytime soon. But we don’t have to live with the status quo. While the misuse of data can perpetuate and amplify inequality, we believe big data has even more potential as a tool for positive social impact. It can improve educational outcomes; connect people to resources like mental health support, loans and financing; help end food deserts; reduce traffic; and improve environmental conditions in polluted neighborhoods. The possibilities are exciting, and we look forward to working with our existing partners and reaching out to the tech community to realize that potential while curbing the potential harms. Look for future blog posts in this space on how big data can be better leveraged as a tool to advance racial and economic equity.

Vinhcent is Greenlining’s Telecommunications Legal Counsel. Follow him on Twitter @VinhcentLe.