Author: N/A
Publisher: Synced
Publication Year: 2020
Summary: This news article covers Genderify, an artificial intelligence (AI)-powered tool designed to identify a person's gender by analyzing their name, username, or email address, which has now been shut down entirely. Genderify was met with fervent criticism on Twitter, with many decrying what they saw as built-in biases. Entering the word "scientist" yielded a 95.7 percent chance of the person being male and only a 4.3 percent chance of the person being female. Models fed biased data will produce biased predictions, and there is concern that many such flawed models will be turned into applications and released onto the market without proper scrutiny. Because AI is trained on existing data, this is an excellent example of how bias present in the data around us is reflected in the models built from it.
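To illustrate the point that a model fed biased data will reproduce that bias, here is a minimal sketch. It is not Genderify's actual method, which was never made public; the word counts are hypothetical and chosen only to mirror the 95.7/4.3 split quoted in the article. The `predict_gender` helper is likewise an illustrative name, not a real API.

```python
# Minimal sketch (not Genderify's implementation): a "model" that simply
# mirrors label frequencies in a skewed corpus reproduces that skew.
from collections import Counter

# Hypothetical training data: (word, labeled_gender) pairs from a corpus
# in which "scientist" overwhelmingly co-occurs with male labels.
training_data = (
    [("scientist", "male")] * 957
    + [("scientist", "female")] * 43
    + [("nurse", "female")] * 900
    + [("nurse", "male")] * 100
)

def predict_gender(word, data):
    """Estimate P(gender | word) purely from label frequencies in the data."""
    counts = Counter(gender for w, gender in data if w == word)
    total = sum(counts.values())
    return {gender: n / total for gender, n in counts.items()}

print(predict_gender("scientist", training_data))
# {'male': 0.957, 'female': 0.043} -- the prediction faithfully reproduces
# the imbalance in the training data, which is the core concern raised here.
```

The sketch makes the mechanism concrete: nothing in the model is "deciding" that scientists are male; it is simply echoing the distribution of the data it was given, which is why scrutiny of training data matters before such systems reach the market.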