Why facial recognition tech perpetuates discrimination

September 30, 2021

Photograph by Anna Shvets from Pexels

Roxanne De Rossi-Leslie examines how algorithms are prejudiced against BAME communities

Recently I’ve been disturbed by the number of incidents in the news involving flawed technology being weaponised against ethnic minorities, specifically black communities.

Instead of new technology protecting the general public, it seems to be compounding existing systemic oppression.

What are Ring Doorbells and what role do they play?
Ring Doorbells, owned by Amazon, are home security devices which incorporate cameras, microphones, and motion detection technology. Essentially, they are doorbells that allow users to monitor their doorstep remotely.

Owned by the same company and automatically linked to the doorbells is the neighbourhood watch app ‘Neighbors’, where Ring owners can anonymously post and share information about local crime and safety.

The Guardian reports that Ring has been partnering with police forces in both the U.S. and the U.K. for several years now, including running social media campaigns for police departments.

During the height of the Black Lives Matter protests in 2020, L.A. police reportedly asked Ring owners for footage of protestors, a move that disproportionately targeted black people.

People of colour are disproportionately posted and reported on neighbourhood watch apps for being ‘suspicious’, reports Vice magazine. The Amazon-owned security system has hinted at using facial recognition software in the future, which is notorious for being extremely inaccurate in identifying people of darker skin tones, especially women.

Buolamwini found a facial recognition error rate of 0.8% for light-skinned men and over 34% for dark-skinned women

Computer scientist Joy Buolamwini, who researches algorithmic bias, found during her research at MIT that the facial recognition software she was using would only recognise her face if she wore a white mask. Buolamwini showed facial recognition systems 1,000 faces and found an error rate of 0.8% for light-skinned men and over 34% for dark-skinned women.
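
To see why a single headline accuracy number can hide a gap like this, here is a small illustrative calculation in Python. The counts are invented purely to mirror the reported 0.8% and 34% figures; it simply shows how an audit like Buolamwini’s breaks error rates down by group rather than averaging over everyone:

```python
# Illustrative only: made-up counts chosen to mirror the reported gap.
# Each group maps to (faces tested, faces misclassified).
results = {
    "lighter-skinned men":  (500, 4),    # 4/500  = 0.8% error
    "darker-skinned women": (100, 34),   # 34/100 = 34% error
}

total = sum(n for n, _ in results.values())
errors = sum(e for _, e in results.values())
print(f"overall error rate: {errors / total:.1%}")   # ~6.3% -- looks modest

# Disaggregating by group reveals the disparity the average hides.
for group, (n, e) in results.items():
    print(f"{group}: {e / n:.1%} error rate")
```

An overall figure of around 6% sounds tolerable; it is only when the results are split by group that the 34% failure rate for dark-skinned women becomes visible.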

Buolamwini and her work are featured in the documentary Coded Bias, directed by Shalini Kantayya. The documentary includes footage of plain-clothes police stopping and fingerprinting a 14-year-old black boy based on an incorrect facial recognition match.

Stop and search is already used disproportionately against black people, at nine times the rate for white people, and this new technology is only further facilitating these already racist institutions.

Coded Bias also tells the story of Atlantic Plaza Towers in Brownsville, a predominantly black area of Brooklyn, where the landlord uses CCTV to harass residents. He speaks of his plans to further fortify the building by adding biometric facial recognition, knowing that the technology is largely inaccurate at identifying dark-skinned people.

What about algorithms in other institutions?
Wired reports that, “Today’s internet is ruled by algorithms. These mathematical creations determine what you see in your Facebook feed, what movies Netflix recommends to you, and what ads you see in your Gmail.”

Yet these systems can be prejudiced, depending on who developed them, where they were created and how they are deployed.

This kind of discrimination is not only prevalent in the police and prison systems but also within healthcare. According to research published in Science, one widely used algorithm was found to be less likely to refer black patients to relevant healthcare programmes than equally ill white patients.

The World Economic Forum investigates how algorithms are even being used to determine bail decisions and jail sentences in the U.S. And, surprise surprise, black defendants were labelled as more likely to be criminals than white defendants.

Algorithms have entered virtually every institution, becoming catalysts for systemic racism and oppression

We are even seeing this manifest in schools. My father is a supply teacher, and most of the schools he works in use facial recognition software to photograph visitors. Often these pictures are printed out and worn on a lanyard so that visitors are visibly registered.

My dad and many of his black colleagues have experienced instances where the technology hasn’t been able to detect their faces to take a picture. There have been many times when a white colleague had to stand in the background for the system to register his face.

Algorithms aren’t just what you see on social media but are determining the life trajectory of millions. They have entered virtually every institution, becoming catalysts for systemic racism and oppression.

So why is this happening and who is responsible?
At the end of the day, computers, robots and artificial intelligence all have to come from somewhere. They’re produced by humans with their own opinions and prejudices that they inevitably project onto their creations.

Buolamwini found that the images the facial recognition software was trained on were made up mostly of men with lighter skin. This is how racism occurs within facial recognition and other technology: it is programmed in.
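
As a rough illustration of that mechanism, the toy sketch below (an entirely made-up model, not how any real commercial system works) trains a crude “face detector” on data dominated by one group, then measures how often it misses faces from the under-represented group:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: each face is reduced to a single feature value, and the two
# groups' faces simply sit in different parts of feature space.
GROUP_A, GROUP_B, NON_FACE = 2.0, -1.0, -5.0

def sample(mean, n):
    return rng.normal(mean, 1.0, n)

def miss_rate(train_a, train_b, test_faces):
    """Train a nearest-centroid face/non-face detector; return its miss rate."""
    face_centroid = np.concatenate([sample(GROUP_A, train_a),
                                    sample(GROUP_B, train_b)]).mean()
    nonface_centroid = sample(NON_FACE, 1000).mean()
    # A face is detected when it lands closer to the learned "average face".
    detected = (np.abs(test_faces - face_centroid)
                < np.abs(test_faces - nonface_centroid))
    return 1 - detected.mean()

test_b = sample(GROUP_B, 10_000)  # group B faces the detector must find

# Skewed training set (900 group A faces, 100 group B) vs a balanced one.
print(f"group B miss rate, skewed training:   {miss_rate(900, 100, test_b):.1%}")
print(f"group B miss rate, balanced training: {miss_rate(500, 500, test_b):.1%}")
```

Real systems are vastly more complex, but the underlying point holds: what a model learns reflects whoever dominates its training data, so the under-represented group is missed far more often.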

BAME people make up 26% of the UK prison population despite comprising only 13% of the general population, and facial recognition is only going to exacerbate the problem. There is a danger that facial recognition will be used by the police and the state as a tool to further oppress ethnic minority groups.

As we’ve seen, algorithms and technology in every form have the potential to be abused. Humans are never blameless. The engineers who have created these systems and the colossal corporations who profit from them should be held accountable.

Roxanne is a college student studying English Language, Sociology, and Philosophy. She loves reading and writing – as well as learning about society and how its structure affects people. She is always looking to educate herself further and hopes to channel this into her writing in the future.

