How Our Data Encodes Systematic Racism

Author: Deborah Raji

Publisher: MIT Technology Review

Publication Year: 2020

Summary: The following article describes how data nearly always lies, especially within policing systems. White supremacy often appears violently in the news, as gunshots at a Walmart or a church, but it can take a more subtle form in data. People building artificial intelligence (AI) systems expose their tolerance of racism by allowing racist practices in how data is collected, defined, and used. Non-white people are the norm, not the outliers, and data should represent that. Data in the policing system is corrupt, yet records from even the worst-behaving police departments are still used to inform predictive policing tools. The article reinforces that data is created by us and that we have control over it. The problems the machine-learning community creates fall on specific groups of people, so addressing them requires an equally specific fight against systematic oppression.