Author: Jonathan Hofer
Publisher: Medium
Publication Year: 2020
Summary: This article focuses on Cincinnati but is applicable nationwide. The city's biased policing data feeds into a biased predictive model, a case of "garbage in, garbage out" with repercussions for minority communities. Hofer notes that even though the data is known to be biased, the fix is not simple, because it is hard to decide how to build a truly ethical dataset. He takes on the question of whether introducing weights can correct for racist policing, and whether doing so is itself problematic: it accepts a certain amount of racism and may lead police to believe an algorithm will correct for their actions, leaving them less motivated to change.
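
For illustration, one common form of the reweighting Hofer discusses is to give each training record a weight that offsets over-policing in certain areas. The sketch below is a minimal, hypothetical example, not the article's method: it uses scikit-learn's `sample_weight` parameter, and the `neighborhood`/`arrested` columns and toy data are invented for this example.

```python
# Minimal sketch of offsetting uneven policing via sample weights.
# Data, column names, and the weighting scheme are hypothetical,
# not taken from Hofer's article.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy records: one incident feature plus the neighborhood it came from.
n = 1000
neighborhood = rng.integers(0, 2, size=n)    # 0 = lightly policed, 1 = heavily policed
x = rng.normal(size=(n, 1))                  # stand-in incident feature
arrested = (x[:, 0] + 0.5 * neighborhood > 0.5).astype(int)

# Heavily policed areas are overrepresented simply because more officers
# are sent there; downweight those records so each neighborhood
# contributes equally to the fitted model.
counts = np.bincount(neighborhood)
weights = (counts.sum() / (len(counts) * counts))[neighborhood]

model = LogisticRegression()
model.fit(np.column_stack([x, neighborhood]), arrested, sample_weight=weights)
```

This kind of inverse-frequency weighting illustrates the trade-off the article raises: the weights can rebalance the data statistically, but choosing them requires deciding how much of the recorded disparity to attribute to biased enforcement rather than to underlying behavior.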