Telecommunications & Technology

Every time you go online, providers and websites collect data about your online activities and preferences. Based on that activity, data companies can identify your ethnicity, age and virtually any other demographic characteristic. These companies aggregate and analyze the data and sell it to banks, insurance companies, marketing firms and other businesses, which use it to make marketing and business decisions about you, stripping away your data privacy for commercial purposes. These decisions can affect your insurance rates, whether you get a loan, and what ads you see online. Greenlining works to ensure that companies don't recreate past systems of discrimination through the increasingly prevalent use of these tools. Greenlining makes its voice heard to regulators at agencies like the Consumer Financial Protection Bureau and the Federal Communications Commission, advocating for stronger consumer privacy protections and the non-discriminatory use of big data.

Digital Redlining

Redlining, the now-illegal practice of denying services and investment to communities of color, is moving online as companies gain access to increasingly intimate and accurate information about us. Lack of data privacy allows companies to track, analyze and sell information about our race, age, ethnicity, location, purchasing history and more. This information can then be used to determine not only which products to market to specific individuals, but also where police patrol more heavily or how likely someone might be to commit a crime. These practices produce unfair results because data often reflect systemic biases, or because the algorithms that process that data mirror the implicit or explicit biases of their creators.

For example, some police departments have a history of targeting communities of color for arrest and prosecution, so records from those departments show higher arrest rates for people of color. An algorithm processing that data might conclude that the higher number of arrests justifies increasing patrols in communities of color even further, perpetuating the institutional bias and producing manifestly unjust results. Companies can also use data about location and browsing history as an increasingly accurate proxy for race, sidestepping traditional civil rights protections. Used carelessly, data builds inequity into systems and enables redlining at far greater scale and speed.
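To make that feedback loop concrete, here is a minimal sketch in Python. The neighborhoods, arrest counts and allocation rule are invented for illustration and do not describe any real policing system; the point is that even when the simulated offense rate is identical everywhere, a historically skewed arrest record keeps steering extra patrols toward the same community, so the skew never corrects.

```python
# Hypothetical sketch of the feedback loop described above.
# Neighborhood names, counts and the allocation rule are invented;
# this does not model any real policing algorithm or dataset.

# Historical arrest counts already skewed by past over-policing,
# even though the true offense rate is identical in both places.
arrests = {"Neighborhood A": 300, "Neighborhood B": 100}
TRUE_OFFENSE_RATE = 0.05   # assumed identical everywhere
TOTAL_PATROLS = 100

for year in range(1, 6):
    # Naive rule: allocate patrols in proportion to recorded arrests.
    total = sum(arrests.values())
    patrols = {n: TOTAL_PATROLS * a / total for n, a in arrests.items()}

    # More patrols produce more *recorded* arrests, regardless of
    # actual offending, so the historical skew feeds back into the data.
    for n, p in patrols.items():
        arrests[n] += p * TRUE_OFFENSE_RATE * 100

    share_a = arrests["Neighborhood A"] / sum(arrests.values())
    print(f"Year {year}: Neighborhood A gets {patrols['Neighborhood A']:.0f} "
          f"of {TOTAL_PATROLS} patrols and holds {share_a:.0%} of recorded arrests")
```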

Greenlining supports several core principles to protect data privacy and prevent digital redlining:

  • Ensure Data Privacy – Consumers should have control over their data and the ability to choose who may view or purchase that data.
  • Data Audits – Companies that use data analytics to provide or block access to economic opportunity (for example, insurers, law enforcement, financial institutions and employers) should undergo regular audits to ensure their algorithms do not disproportionately target or exclude people of color; a simple version of such a check is sketched after this list.
  • Transparency – Algorithms process data to make decisions about us, yet these processes are often opaque. Individuals should have access to the information collected about them and be able to correct data points that are incomplete or incorrect. Without transparency, we also have no way of knowing whether people are being denied economic opportunities based on illegitimate or irrelevant data, such as a consumer's creditworthiness being judged by where their neighbors shop. We need strong transparency protections to ensure that companies use data properly and without bias.
  • Diversity – Communities of color are underrepresented in the tech sector, and this creates blind spots in how companies design algorithms and use data. Greenlining envisions a future where companies hire a diverse, representative workforce in order to eliminate both overt and hidden bias as they design and use data analytics.
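As a rough illustration of the kind of audit described above, the sketch below compares a hypothetical lending model's approval rates across demographic groups and applies the four-fifths (80 percent) adverse-impact benchmark long used in employment-discrimination analysis. The records, group labels and threshold are illustrative assumptions, not a prescribed audit methodology.

```python
# Hypothetical disparate-impact check an auditor might run on a
# lending model's decisions. All records here are made up.
from collections import defaultdict

decisions = [  # (self-reported group, did the model approve the application?)
    ("Group 1", True), ("Group 1", True), ("Group 1", True), ("Group 1", False),
    ("Group 2", True), ("Group 2", False), ("Group 2", False), ("Group 2", False),
]

approved = defaultdict(int)
total = defaultdict(int)
for group, ok in decisions:
    total[group] += 1
    approved[group] += ok

rates = {g: approved[g] / total[g] for g in total}
best = max(rates.values())

# Four-fifths rule: flag any group whose approval rate falls below
# 80% of the most favored group's rate.
for group, rate in rates.items():
    ratio = rate / best
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: approval rate {rate:.0%}, ratio to top group {ratio:.2f} -> {flag}")
```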

Access to Economic Opportunity

Economic empowerment requires access to things like credit, employment and insurance. The price of a credit line or auto insurance, or whether a job applicant gets an interview, often depends on the data that banks, employers and insurance companies gather and the predictions their algorithms make about you. Greenlining works with regulators like the Consumer Financial Protection Bureau to help influence rules about how data should be used and to ensure those rules include the safeguards necessary to protect data privacy and stop discrimination.

Mass Surveillance

Big data and new technologies give law enforcement agencies access to tools that can exacerbate harms in communities of color. Law enforcement often disproportionately aims tools like predictive policing algorithms, facial recognition, Stingray cell-site simulators and targeting databases at social movements, Muslims, immigrants and Black communities. Greenlining, together with allied organizations, advocates for laws and practices that protect communities of color from even more intrusive mass surveillance by public and private entities.
