Author: Carmen Niethammer
Publisher: Forbes
Publication Year: 2020
Summary: The following article discusses the bias introduced into algorithms by data sets composed mainly of information about the male population. A paper released for the European Union on the regulation of artificial intelligence includes requirements for algorithm builders to act against unlawful discrimination. Among these forms of discrimination is gender discrimination, which can be found in many algorithms. This bias is introduced when data scientists and programmers use data sets that do not adequately represent women, and it can have drastic consequences. For example, the article points out that seat belt tests have mainly been performed on male-shaped dummies, making seat belt designs less safe for women. A gender gap also exists in the medical field, where women are often excluded from studies, producing research results that may not apply to them. The technology industry is another field in which a gender gap exists. Lastly, the article discusses policies and legislation that could support women and help close the gender gap in many of these industries.