Author: Joy Buolamwini

Publisher: SpeakInc

Publication Year: 2018

Summary: In the following video, Joy Buolamwini expands on the "Coded Gaze," her term for algorithmic bias that can lead to discriminatory practices or exclusionary experiences. She recalls how facial recognition software detected her face only after she put on a white mask. She notes that 130 million people in the U.S. have their faces in facial recognition databases that can be searched without a warrant, using algorithms that have not been audited for accuracy. She found that the benchmark datasets used to evaluate the performance of facial analysis technology consist largely of white males, so she developed a new dataset with better gender parity and a wider representation of skin types. Using it, she set out to answer how accurate the big tech companies' systems are at binary gender classification, and she lays out the accuracy gaps across gender and skin type. In response, IBM made improvements; however, Joy stresses that how models are deployed is just as important as the improvements made. There must be greater oversight to ensure the technology is used ethically, and the general public needs a better understanding of the social implications of artificial intelligence.