Gross, Nicole (2023) What ChatGPT Tells Us about Gender: A Cautionary Tale about Performativity and Gender Biases in AI. Social Sciences, 12 (8). ISSN 2076-0760
Abstract
Large language models and generative AI, such as ChatGPT, have gained influence over people’s personal lives and work since their launch, and are expected to scale even further. While the promises of generative artificial intelligence are compelling, this technology harbors significant biases, including those related to gender. Gender biases create patterns of behavior and stereotypes that put women, men and gender-diverse people at a disadvantage. Gender inequalities and injustices affect society as a whole. As a social practice, gendering is achieved through the repeated citation of rituals, expectations and norms. Shared understandings are often captured in scripts, including those emerging in and from generative AI, which means that gendered views and gender biases get grafted back into social, political and economic life. This paper’s central argument is that large language models work performatively, which means that they perpetuate and perhaps even amplify old and non-inclusive understandings of gender. Examples from ChatGPT are used here to illustrate some gender biases in AI. However, this paper also puts forward that AI can work to mitigate biases and act to ‘undo gender’.
| Item Type: | Article |
|---|---|
| Additional Information: | © 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
| Uncontrolled Keywords: | gender; gender bias; ChatGPT; large language models; generative AI; performativity; ethical AI |
| Subjects: | Q Science > QH Natural history > QH301 Biology > Methods of research. Technique. Experimental biology > Data processing. Bioinformatics > Artificial intelligence; Q Science > Q Science (General) > Self-organizing systems. Conscious automata > Artificial intelligence; H Social Sciences > HV Social pathology. Social and public welfare > Discrimination; H Social Sciences > HQ The family. Marriage. Woman > Gender |
| Divisions: | School of Business > Staff Research and Publications |
| Depositing User: | Tamara Malone |
| Date Deposited: | 02 Aug 2023 13:38 |
| Last Modified: | 02 Aug 2023 13:38 |
| URI: | https://norma.ncirl.ie/id/eprint/6770 |