AI discrimination trap? - Recommendations for combating discrimination against women through AI


DOI: https://doi.org/10.31039/plic.2024.12.266

Keywords: AI bias, gender bias in AI, AI discrimination, algorithmic bias, AI gender gap

Abstract
The current overrepresentation of men in the development of AI technologies, and the one-sided gender perspective that comes with it, is one of the reasons for discrimination against women in AI applications. AI in turn exacerbates this imbalance, presenting a new discriminatory risk to women. Today's omnipresence of AI makes the need for fair algorithms all the more crucial. This study therefore analyzes five application areas of AI (virtual personal assistants, public and private transport and safety, precision medicine, employment, and credit rating), discusses the potential biases against women identified in each, and proposes anti-discrimination measures to overcome this invisibility of women in many AI applications.
References
Armutat, S.; Mauritz, N.; Prädikow, L.; Schulte, M.; Wattenberg, M. (2023). Fit für KI? – Genderspezifische Unterschiede in der Wahrnehmung, dem Verständnis und in den Weiterbildungswünschen bezüglich Künstlicher Intelligenz [Fit for AI? Gender-specific differences in perception, understanding, and continuing-education preferences regarding artificial intelligence]. https://doi.org/10.57720/3734
Asha, A.Z.; Sultana, S.; He, H.; Sharlin, E. (2024). "Shotitwo First!": Unraveling Global South Women's Challenges in Public Transport to Inform Autonomous Vehicle Design. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 3193-3209. https://doi.org/10.1145/3643834.366155
Carnevale, A.; Tangari, E.A.; Iannone, A.; Sartini, E. (2023). Will Big Data and personalized medicine do the gender dimension justice? AI & Society, 38(2), 829-841. https://doi.org/10.1007/s00146-021-01234-9
Chioda, L.; Gertler, P.; Higgins, S.; Medina, P. (2024). Equitable AI Challenge: Improving access to credit with gender-differentiated credit scoring algorithms: Executive Summary. https://www.usaid.gov/sites/default/files/2024-06/AI%20Executive%20Summary_Credit%20Scoring_1_0.pdf
Cirillo, D.; Catuara-Solarz, S.; Morey, C.; Guney, E.; Subirats, L.; Mellino, S.; Gigante, A.A.; Valencia, A.; Rementeria, M.J.; Chadha, A.S.; Mavridis, N. (2020). Sex and gender differences and biases in artificial intelligence for biomedicine and healthcare. NPJ Digital Medicine, 3, 81. https://doi.org/10.1038/s41746-020-0288-5
De Felice, F.; Petrillo, A.; Luca, C.; Baffo, I. (2022). Artificial Intelligence or Augmented Intelligence? Impact on our lives, rights and ethics. Procedia Computer Science, 200, 1846-1856. https://doi.org/10.1016/j.procs.2022.01.385
Demirgüç-Kunt, A.; Klapper, L.; Singer, D.; Van Oudheusden, P. (2015). The Global Findex Database 2014: Measuring financial inclusion around the world. World Bank Policy Research Working Paper 7255. https://thedocs.worldbank.org/en/doc/681361466184854434-0050022016/original/2014GlobalFindexReportDKSV.pdf
European Union Agency for Fundamental Rights (2022, December 8). Bias in Algorithms – Artificial Intelligence and Discrimination. https://fra.europa.eu/en/publication/2022/bias-algorithm
Hajirasouliha, I.; Elemento, O. (2020). Precision medicine and artificial intelligence: overview and relevance to reproductive medicine. Fertility and Sterility, 114, 908-913. https://doi.org/10.1016/j.fertnstert.2020.09.156
Heinrichs, B. (2022). Discrimination in the age of artificial intelligence. AI & Society, 37, 143–154. https://doi.org/10.1007/s00146-021-01192-2
Hill, T. (2023, October 2). Bias in AI for precision medicine. https://www.reprocell.com/blog/bias-in-ai-for-precision-medicine
Hoffman, S.; Podgurski, A. (2020). Artificial Intelligence and Discrimination in Health Care. Yale Journal of Health Policy, Law, and Ethics, 19(3), Case Legal Studies Research Paper No. 2020-29. Available at SSRN: https://ssrn.com/abstract=3747737
Iriondo, R. (2018, October 11). Amazon Scraps Secret AI Recruiting Engine that Showed Biases Against Women. https://www.ml.cmu.edu/news/news-archive/2016-2020/2018/october/amazon-scraps-secret-artificial-intelligence-recruiting-engine-that-showed-biases-against-women.html
Kelly, S.; Mirpourian, M. (2021). Algorithmic Bias, Financial Inclusion, and Gender. New York: Women’s World Banking. https://www.womensworldbanking.org/wp-content/uploads/2021/02/2021_Algorithmic_Bias_Report.pdf
Kim, P. (2018). Big Data and Artificial Intelligence: New Challenges for Workplace Equality. University of Louisville Law Review, Forthcoming. https://ssrn.com/abstract=3296521
Lynch, S. (2017, March 11). Andrew Ng: Why AI Is the New Electricity. https://www.gsb.stanford.edu/insights/andrew-ng-why-ai-new-electricity
Manasi, A.; Panchanadeswaran, S.; Sours, E.; Lee, S. (2022). Mirroring the bias: gender and artificial intelligence. Gender, Technology and Development, 26(3), 295–305. https://doi.org/10.1080/09718524.2022.2128254
Niethammer, C. (2020, March 2). AI Bias Could Put Women’s Lives At Risk - A Challenge For Regulators. https://www.forbes.com/sites/carmenniethammer/2020/03/02/ai-bias-could-put-womens-lives-at-riska-challenge-for-regulators/
Ni Loideain, N.; Adams, R. (2018, November 9). From Alexa to Siri and the GDPR: The Gendering of Virtual Personal Assistants and the Role of EU Data Protection Law. King's College London Dickson Poon School of Law Legal Studies Research Paper Series. https://ssrn.com/abstract=3281807 or http://dx.doi.org/10.2139/ssrn.3281807
Ore, O.; Sposato, M. (2022). Opportunities and risks of artificial intelligence in recruitment and selection. International Journal of Organizational Analysis, 30(6), 1771-1782. https://doi.org/10.1108/IJOA-07-2020-2291
Schuß, M.; Wintersberger, P.; Riener, A. (2021). Security Issues in Shared Automated Mobility Systems: A Feminist HCI Perspective. Multimodal Technologies and Interaction, 5(8), 43. https://doi.org/10.3390/mti5080043
Singh, S.; Kumar, A.; Bose, S. (2024). Behind The Feminine Facade: Gender Bias in Virtual Assistants and Its Effect on Users. Journal of Ecohumanism, 3(4), 351-356. http://dx.doi.org/10.62754/joe.v3i4.3592
Sutko, D.M. (2020). Theorizing femininity in artificial intelligence: a framework for undoing technology's gender troubles. Cultural Studies, 34(4), 567-592. https://doi.org/10.1080/09502386.2019.1671469
UNESCO (2024, July 5). Generative AI: UNESCO study reveals alarming evidence of regressive gender stereotypes. https://www.unesco.org/en/articles/generative-ai-unesco-study-reveals-alarming-evidence-regressive-gender-stereotypes
Vicente, L.; Matute, H. (2023). Humans inherit artificial intelligence biases. Scientific Reports, 13(1), article number 15737. https://doi.org/10.1038/s41598-023-42384-8
West, S.M.; Whittaker, M.; Crawford, K. (2019). Discriminating Systems: Gender, Race and Power in AI. AI Now Institute. https://ainowinstitute.org/wp-content/uploads/2023/04/discriminatingsystems.pdf
West, M.; Kraut, R.; Chew, H.E. (2019). I’d blush if I could: closing gender divides in digital skills through education. https://unesdoc.unesco.org/ark:/48223/pf0000367416
Yogeshappa, V. (2024). AI-driven Precision medicine: Revolutionizing personalized treatment plans. International Journal of Computer Engineering and Technology, 15(5), 455-474. https://doi.org/10.5281/zenodo.13843057
License
Copyright (c) 2024 Christina Schabasser

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
You are free to:
Share: copy and redistribute the material in any medium or format. The licensor cannot revoke these freedoms as long as you follow the license terms.
Under the following terms: Attribution, NonCommercial, NoDerivatives. No additional restrictions.
Authors retain copyright and agree to license their articles under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.