Keywords: Algorithmic Bias, Coded Gaze, Discriminatory Practices, Facial Recognition Systems, Fairness, Machine Learning, Representation in Training Datasets
Author: Joy Buolamwini
Publisher: TED
Publication Year: N/A
Summary: This TED talk discusses how algorithmic bias, or the “coded gaze,” can lead to discriminatory practices. Machine learning powers facial recognition systems, but a lack of Black and Brown faces in training datasets has left those systems unable to identify such faces. Buolamwini suggests incorporating three inclusive practices in coding: 1) building diverse teams, 2) factoring in fairness as we code, and 3) coding for equality by “[making] social change a priority and not an afterthought.”
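To make the talk's core claim concrete, the sketch below shows one common way "factoring in fairness" can surface the problem Buolamwini describes: evaluating a model's accuracy separately for each demographic group, so that failures on underrepresented groups are visible rather than averaged away. This is a minimal illustrative sketch, not a method from the talk itself; the names `load_labeled_faces` and `model.predict` are hypothetical placeholders.

```python
# Minimal sketch of a disaggregated (per-group) evaluation for a
# face-recognition model. A large accuracy gap between groups can signal
# that the training set underrepresents the lower-scoring group.
from collections import defaultdict

def accuracy_by_group(examples, predict):
    """examples: iterable of (image, true_label, group) tuples.
    predict: a function mapping an image to a predicted label.
    Returns a dict mapping each group to the model's accuracy on it."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for image, true_label, group in examples:
        total[group] += 1
        if predict(image) == true_label:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

# Hypothetical usage (load_labeled_faces and model are assumed, not real APIs):
# scores = accuracy_by_group(load_labeled_faces(), model.predict)
# print(scores)  # per-group accuracy; a large gap flags potential bias
```

Reporting one overall accuracy number would hide exactly the disparity the talk warns about, which is why the sketch keeps the per-group breakdown instead of aggregating it.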