Your data discriminates against you.

In 2016, Mikhail, Carmen Arroyo's 22-year-old son, regained consciousness after six months in a coma. He had been electrocuted on a telephone pole and had fallen nearly 30 feet, leaving him unable to walk, talk, or care for himself. Arroyo, who was 44 years old at the time, submitted an application to her landlord for permission for Mikhail to live with her in her apartment in Willimantic, Connecticut. According to court records, the application was denied almost immediately, without explanation, and Mikhail stayed in a rehabilitation facility instead.

Arroyo contacted the Connecticut Fair Housing Center (CFHC), a nonprofit organization that provides free legal services to alleged victims of housing discrimination. In the course of filing a complaint against the landlord, Arroyo and her attorneys learned that the landlord did not even know why the application had been denied: the decision was made not by the landlord but by an algorithm from CoreLogic, a software company the landlord had hired to screen prospective tenants. After Arroyo filed the complaint, the landlord allowed Mikhail to move in with his mother. According to court documents, Arroyo's attorneys continued their investigation and eventually determined the cause of the denial. "He is now severely disabled, and despite being incapable of committing a crime, he was blacklisted from housing," says CFHC staff attorney Salmun Kazeronian, who represents Arroyo.

What happened to the Arroyo family is just one example of how data can lead to discrimination. Automated data systems like CoreLogic's use the information they collect (sometimes packaged with public records such as DMV and court files, and sometimes with information gleaned from the internet, such as social media activity) to shape life-altering decisions: whether applicants get jobs, what insurance rates they pay, even how their communities are policed. In theory, these systems are built to eliminate the biases present in human decision-making. In practice, they can foster bias, because algorithms are often built on biased data and fail to consider other relevant factors.

Because low-income individuals have more contact with government agencies (for benefits like Medicaid), a disproportionate amount of their information is fed into these systems. Not only can this data end up in the hands of corporations, but the government itself uses it for surveillance. When UC Berkeley law professor Kiara Bridges interviewed pregnant women applying for prenatal care through Medicaid, for example, she found that they were required to reveal their sexual histories and any domestic violence. "We would talk to pregnant women who came to the clinic just to get prenatal care, only to get a call the next day from Child Protective Services," says Bridges. People who apply for public benefits are flagged as risky, says Michelle E. Gilman, a professor of law at the University of Baltimore School of Law, which can limit their future housing and employment opportunities. Those who never need to apply for public benefits are spared these inequities.

Biased data is then used to justify surveillance, creating an endless feedback loop of discrimination. In 2012, the Chicago Police Department used predictive analytics, relying primarily on arrest-record data, to identify individuals it deemed more likely to commit gun violence or to become victims of it; police then increased their surveillance of those individuals. The program was shelved in 2019 after it proved ineffective at reducing homicides, and civil rights groups noted that it promoted racial bias. "The algorithm is developed from what you give it," says Brandi Collins-Dexter, senior campaign director for the racial justice advocacy group Color of Change. "Give an algorithm garbage and it will give you garbage." Give it biased information, and it will keep giving you biased information.

This reality is at the heart of the Arroyo case: as a Latino, Mikhail is one of the roughly one-third of working-age Americans with a criminal record, a disproportionate number of whom are black or Latino. His lawyers are suing CoreLogic, claiming that the company's software reinforces discriminatory policies under the guise of neutrality and efficiency. If Arroyo wins the case, it will be a small step forward. But unless the U.S. adopts stronger data privacy laws, these life-altering structures, "both powerful and invisible," will go largely unchecked, according to Jennifer Lee of the ACLU of Washington. Until then, we will continue to be monitored by invisible, discriminatory systems, and minority groups will feel their gaze the most.

Algorithms can use data to make judgments about specific people, like Arroyo, or to draw inferences about entire groups, which can lead to sweeping and discriminatory generalizations: Facebook has been accused of showing different housing and job ads to different users based on race and gender, and in 2018 Amazon was found to have favored male candidates over female candidates in its automated hiring tool. In these cases, bias was built into the decision-making tools themselves. According to Collins-Dexter, data points such as a person's taste in music or zip code can act as "proxies" for race or gender, even when race and gender are never explicitly recorded.

This article originally appeared in the Fall 2020 issue of Marie Claire.
