AI Facial Recognition Systems Work the Worst for Black Women

Author: Dianna Mazzone

Publisher: Allure

Publication Year: 2022

Summary: The following article describes how Dr. Joy Buolamwini, executive director of The Algorithmic Justice League, has been working to fight racial and gender bias in facial recognition systems. These systems frequently fail to detect the faces of individuals with darker skin tones, and they perform worst on darker-skinned women in particular; even when these women's faces are detected, they are often misclassified as male. This algorithmic injustice has considerable implications across many areas. In the criminal justice system, false facial recognition matches impact marginalized communities the most. In the beauty space, artificial intelligence used for product recommendations carries an underlying bias toward Eurocentric beauty standards. Computers are not neutral: when historical data is used to train these systems, the future they produce reflects a racist, sexist past. Questioning bias should be part of the conversation from the very beginning of building such systems.