Amazon Scraps Secret AI Recruiting Tool that Showed Bias Against Women

Author: Jeffrey Dastin

Publisher: Reuters

Publication Year: 2018

Summary: The article discusses how streamlining the hiring process is a goal for most large companies. Some, like Amazon, have applied artificial intelligence (AI) methods to build a model that indicates which candidates are best suited for a given position. One important consideration when using AI methods is knowing how a model was trained. The specific problem Amazon ran into with its hiring model was that the model was biased against women. The reason is that the model was trained on resumes of individuals who had applied to Amazon, and those applicants were predominantly male, reflecting the male dominance of the tech industry. As a result, the model learned to favor male applicants over female applicants. While this bias was an unintentional result of the training process, it serves as a lesson to all data scientists and to any company wanting to build a hiring model to ease the process. Data scientists should be cautious about any model they create and deploy; they may run into issues that were neither expected nor intentionally created. It is important to verify that a trained model is not biased against any group of people.
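The failure mode described above can be illustrated with a small, entirely hypothetical sketch. The data, the word-frequency scoring scheme, and all function names below are invented for illustration and are not Amazon's actual system; the point is only that a model scored on historical hiring outcomes will absorb whatever skew those outcomes contain, penalizing terms (such as "women's") that happen to appear mostly in the under-hired group.

```python
from collections import Counter

# Hypothetical historical training data: short resume texts labeled by
# whether the applicant was hired. The pool is mostly male, so gendered
# terms like "women's" appear only in the not-hired examples.
training = [
    ("software engineer java men's chess club", 1),
    ("backend developer python hackathon", 1),
    ("systems programmer c++ robotics", 1),
    ("software engineer women's chess club captain", 0),
    ("developer python women's coding society", 0),
]

def word_scores(data):
    """Score each word by the hired-rate among resumes containing it."""
    hired, seen = Counter(), Counter()
    for text, label in data:
        for w in set(text.split()):
            seen[w] += 1
            hired[w] += label
    return {w: hired[w] / seen[w] for w in seen}

def score_resume(text, scores):
    """Average the learned word scores over the words in a resume."""
    vals = [scores[w] for w in set(text.split()) if w in scores]
    return sum(vals) / len(vals) if vals else 0.5

scores = word_scores(training)

# Two otherwise-identical resumes: the gendered term alone lowers the
# score, purely because of the skewed history, not candidate quality.
a = score_resume("software engineer python chess club", scores)
b = score_resume("software engineer python women's chess club", scores)
print(a > b)  # → True
```

The bias here was never programmed in; it emerges mechanically from the label distribution in the training data, which is exactly the lesson the article draws.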