Author: Li Lucy, David Bamman

Publisher: Proceedings of the 3rd Workshop on Narrative Understanding

Publication Year: 2021

Summary: This article examines GPT-3, a commercially available natural language generator that has proven useful in machine-in-the-loop creative writing. The authors show that the stories GPT-3 generates from simple prompts reinforce gender biases: male characters are more often prominent, mirroring their overrepresentation in books, and are associated with topics such as politics, sports, and war, while female characters are associated with topics like family and emotions. Gender stereotypes about appearance and power were also prevalent in the generated stories. Although the model’s opaque nature makes it difficult to pinpoint exactly how these stereotypes enter its output, they clearly stem from the stories it was trained on. Models work with what we give them, and for algorithmic tools like GPT-3 the authors suggest balancing “creativity and controllability” to avoid unintended biases.