Exposing Gender Bias in Artificial Intelligence: Towards a More Equitable and Inclusive Technology

Imagine a world where, every time you submit a job application, the platform suggests opportunities based not on your qualifications but on stereotypes. What's frightening is that this isn't a dystopian fantasy; it's a reality already shaped by artificial intelligence (AI). In this blog post, we'll delve into gender biases in AI: how these biases arise, their consequences in the real world, and what we can do to create fairer and more inclusive technologies.

AI has the potential to revolutionize many industries and improve people's lives in countless ways. However, one of the main challenges in developing and deploying AI systems is the presence of bias (Ferrara, 2023; Hall and Ellis, 2023). AI appears neutral, lacking obvious personal biases or emotions, but because AI systems are developed and trained by humans, they inevitably inherit and reflect human biases, including gender bias. Numerous studies support this. Smith and Rustagi (2021) found that of 133 AI systems analyzed, approximately 44% exhibited gender bias and 25% exhibited both gender and racial bias. West, Whittaker and Crawford (2019) revealed that AI-based resume services from companies like Amazon and LinkedIn predominantly sent IT job vacancies to male job seekers. AI also often perpetuates gender stereotypes by associating certain roles and qualities with specific genders (Ferrara, 2023). For instance, if you ask ChatGPT to generate images of a doctor and a nurse, you might receive a male doctor and a female nurse, further demonstrating the bias embedded within these systems.

Bias in AI can arise at various stages, including data collection, algorithm design, and user interaction (Ferrara, 2023). Firstly, many studies suggest that a primary cause of AI gender bias is the lack of diversity in collected data (Wellner, 2020). Leavy et al. (2020) pointed out that the data used for algorithm training often captures societal inequalities and discriminatory attitudes, leading to the perpetuation of bias. Secondly, the imbalance in gender representation means male perspectives dominate the AI development process, where developers and researchers might unintentionally embed their gender biases into algorithms. Recent studies show that only 18% of leaders at major AI conferences are women, and over 80% of AI professors are men (West, Whittaker and Crawford, 2019). According to Young, Wajcman and Sprejer (2021), women make up 22% of professionals in AI and data science and are more likely to be employed in lower-status positions. Lastly, gender bias in AI systems affects users through their decisions and actions, and user feedback further reinforces these biases (Wellner, 2020). In this mechanism, biases exhibited by users during interaction with AI systems are learned and reflected back by the system, ultimately leading to the accumulation and amplification of gender bias in a vicious cycle.
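To make the first mechanism concrete, here is a minimal sketch of how historically skewed data can encode bias that a model would then reproduce. The data set below is entirely synthetic and hypothetical, and the metric shown (the gap in selection rates between groups, often called demographic parity difference) is just one simple first-pass check, not a complete fairness audit.

```python
# A minimal sketch: skewed historical decisions encode a gender gap
# that any model trained to imitate them would inherit.
# All data here is synthetic and for illustration only.

def selection_rate(decisions, genders, group):
    """Fraction of applicants from `group` who received a positive decision."""
    outcomes = [d for d, g in zip(decisions, genders) if g == group]
    return sum(outcomes) / len(outcomes)

# Hypothetical historical hiring decisions (1 = invited to interview).
genders   = ["m", "m", "m", "m", "f", "f", "f", "f"]
decisions = [ 1,   1,   1,   0,   1,   0,   0,   0 ]

rate_m = selection_rate(decisions, genders, "m")  # 0.75
rate_f = selection_rate(decisions, genders, "f")  # 0.25

# Demographic parity difference: a common first-pass fairness check.
# A large gap in the training labels is a warning sign before any
# model is trained on them.
parity_gap = rate_m - rate_f
print(parity_gap)  # 0.5
```

In practice, checks like this are run on both the training data and the model's outputs; libraries such as Fairlearn provide more rigorous versions of these metrics.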

Algorithmic bias causes real harm to marginalized groups, including women (Leavy et al., 2020). Furthermore, biased AI systems can undermine public trust in technology, leading to reduced adoption or outright rejection of new technologies, with serious economic and social consequences (Ferrara, 2023). Therefore, like all industries, the AI industry must strive for equality in its approaches and perspectives. To achieve true fairness and neutrality, it is essential to actively identify and eliminate these biases at all stages of AI development and application. Ensuring gender diversity, equity, and inclusion within the teams developing and managing AI systems is crucial (Smith and Rustagi, 2021). Gender-diverse teams bring different perspectives and experiences, allowing them to consider a wider range of user needs and potential issues during development, incorporate the voices of marginalized communities, and reduce the occurrence of gender bias. This means the AI field needs more women, and society must provide opportunities for women to access and lead in STEM and ICT education and careers (UN Women, 2024). Additionally, research by Nadeem, Marjanovic and Abedin (2021) shows that gender bias in AI is not merely a technical issue, and thus technical solutions alone may be insufficient. When making decisions with AI-supported systems, it is crucial to conduct checks and reviews to ensure that gender bias is prevented.

While bias may be an unavoidable aspect of life, we must not allow it to become an inevitable part of new technologies. AI offers us a chance to start anew, but it is up to people, not machines, to eliminate bias. Addressing the implications of biased AI requires the collective effort of all stakeholders, including developers, policymakers, and society at large. By working together to mitigate these risks, I believe the benefits of AI can far outweigh the challenges.

References

Ferrara, E., 2023. Fairness and bias in artificial intelligence: A brief survey of sources, impacts, and mitigation strategies. Sci, 6(1), p.3.

Hall, P. and Ellis, D., 2023. A systematic review of socio-technical gender bias in AI algorithms. Online Information Review, 47(7), pp.1264–1279. doi:10.1108/OIR-08-2021-0452.

Leavy, S. et al., 2020. Mitigating gender bias in machine learning data sets. arXiv. doi:10.48550/arXiv.2005.06898.

Nadeem, A., Marjanovic, O. and Abedin, B., 2021. Gender bias in AI: Implications for managerial practices. In: Responsible AI and Analytics for an Ethical and Inclusive Digitized Society: 20th IFIP WG 6.11 Conference on e-Business, e-Services and e-Society, I3E 2021, Galway, Ireland, September 1–3, 2021, Proceedings. Springer International Publishing, pp.259–270.

Smith, G. and Rustagi, I., 2021. When good algorithms go sexist: Why and how to advance AI gender equity. Stanford Social Innovation Review.

Wellner, G.P., 2020. When AI is gender-biased: The effects of biased AI on the everyday experiences of women. Humana.Mente, 13(37), pp.127–150.

West, S.M., Whittaker, M. and Crawford, K., 2019. Discriminating systems. AI Now, pp.1–33.

Young, E., Wajcman, J. and Sprejer, L., 2021. Where are the women? Mapping the gender job gap in AI.

Jane Anderson