Women in Art and Science
  • ISSN: 1477-965X
  • E-ISSN: 1758-9533

Abstract

Current research in artificial intelligence (AI) sheds light on the algorithmic bias embedded in AI systems. The underrepresentation of women in the AI design sector of the tech industry, as well as in training datasets, results in technological products that encode gender bias, reinforce stereotypes and reproduce normative notions of gender and femininity. Biased behaviour is notably reflected in anthropomorphic AI systems, such as personal intelligent assistants (PIAs) and chatbots, which are usually feminized through design parameters such as names, voices and traits. The gendering of AI entities, however, is often reduced to the encoding of stereotypical behavioural patterns that perpetuate normative assumptions about the role of women in society. The social impact of this behaviour grows as human-to-(anthropomorphic-)machine interactions are mirrored in human-to-human interactions. This article presents current critical research on AI bias, focusing on anthropomorphic systems. Moreover, it discusses the significance of women’s engagement in AI design and programming by presenting selected case studies of contemporary female media artists and designers. Finally, it suggests that, through their creative practice, women provide feminist and critical approaches to AI design that are essential for imagining alternative, inclusive, ethical and de-biased futures for anthropomorphic AIs.
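
The claim that biased training data becomes biased product behaviour can be made concrete with a small probe of pretrained word embeddings. The sketch below is illustrative rather than drawn from the article: it assumes the gensim library and its downloadable GloVe vectors, and it reproduces the well-known ‘man is to computer programmer as woman is to homemaker’ style of analogy test.

    # Probe a pretrained embedding model for gendered associations.
    # Illustrative only; assumes gensim and a network connection for the
    # one-time download of the GloVe vectors.
    import gensim.downloader as api

    model = api.load("glove-wiki-gigaword-50")  # trained on Wikipedia + Gigaword

    # Analogy probe: "man" is to "programmer" as "woman" is to ...?
    # Stereotyped completions reflect statistics absorbed from the corpus.
    for word, score in model.most_similar(positive=["woman", "programmer"],
                                          negative=["man"], topn=5):
        print(f"{word}: {score:.3f}")

Any association the probe surfaces is learned from corpus statistics alone; debiasing work in this area attempts to identify such gendered directions in the embedding space and project them out before the vectors reach downstream products.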

