Author: Inioluwa Deborah Raji
Publisher: Patterns
Publication Year: 2020
Summary: The article begins by confronting the idea that data science reduces people to one-dimensional ideas or stereotypes. When individuals or organizations compress groups into a model or algorithm, they can create or perpetuate stereotypes, and these reductions can cause real harm. The article then addresses how the “defaults” by which data is defined can reinforce racism and discrimination. For example, searching the internet for “unprofessional hairstyles” returns images of Black women wearing natural hairstyles, while “professional hairstyles” returns images of white individuals. In many data scenarios, white men are treated as the default. Many datasets in computer vision originate in the United States and the United Kingdom, and this lack of geographic and demographic diversity can reinforce harmful stereotypes and present them as reality. The article closes by recommending that data scientists move from collecting data because it is easy to collecting it with care and intention, a practice to which many are not accustomed. Data science and data collection must continue to evolve in order to reflect an ever-changing world.