8 March 2024

Developing discrimination machines? Cross-sectoral discussion on gender inclusive AI

On International Women’s Day, Ghana and Norway, as co-champions of the UNFPA Equity 2030 Alliance, together with UN agencies and Women at the Table, organized a panel discussion to explore challenges and best practices in developing gender-inclusive approaches to artificial intelligence.

Representatives from health, human rights, humanitarian and labour organizations participated in a cross-sectoral expert panel, moderated by Women at the Table’s Caitlin Kraft-Buchman. 

– If AI is sometimes perceived as a gold mine, gender is one of the canaries in that mine, said Norway’s Ambassador Tormod Endresen. – We all play a role in ensuring that instead of making discrimination machines, we develop tools for inclusion. 

UN Women’s Helene Molinier presented a new position paper released by the Coalition on Technology and Innovation for Gender Equality, called Placing Gender Equality at the Heart of the Global Digital Compact. Molinier emphasized that AI can generate new forms of social vulnerability, and that now is the time to close gaps in digital governance through the negotiation of the Global Digital Compact. Prioritizing gender perspectives in AI is instrumental not just because it will benefit women and girls, but because it will set countries on a path towards a more inclusive future for everyone, she said. 

Invisible in data sets  

Ishita Barua, co-founder and Chief Medical Officer of the start-up LIVV Health, took an optimistic view, emphasizing that AI tools have great potential to provide meaningful services. But it is key, she said, that all individuals own their own data. 

Data is at the heart of everything and has become the glue that binds us all together, said Steve MacFeely from WHO. He pointed out that good data governance is a prerequisite for good AI governance: don’t get distracted by the noise, focus on the signals; don’t forget about the data! 

MacFeely highlighted that gender digital divides can arise when women do not own or have access to phones, or are not registered at birth. This creates a data divide: they become invisible in data sets and are therefore underrepresented in analyses and models, which ‘hard-codes’ and reinforces gender stereotypes in the models we are building. 

Exacerbating stereotypes 

Among the human rights implications, said OHCHR’s Scott Campbell, is that as human prejudice and stereotypes are reproduced, access to essential services such as health care can be blocked. Discussions must be grounded in human rights frameworks, with gender equality and non-discrimination as fundamental principles. 

Hovig Etyemezian and Rebeca Moreno Jimenez from UNHCR emphasized that misinformation and disinformation challenges in humanitarian contexts become worse with the auto-generation of content. AI can exacerbate stereotypes of refugee women, and when technology goes unsupervised it can harm people who are already targeted. They also highlighted that working closely with refugees and displaced communities can help design AI-based solutions to complex problems while ensuring that do-no-harm principles are upheld.

From a labour perspective, ILO’s Mia Seppo pointed out that women are overall 2.5 times more affected by automation than men, and more so in high-income countries. A badly managed transition could disproportionately harm women. Investment is needed to create opportunities for women to receive training and benefit from new jobs, which will likely be technology-driven roles in sectors where women are currently under-represented. The gender pay gap of over 25 per cent for science, engineering and ICT professionals in one out of four countries needs to be closed, and pathways found from informal work to formalized technology-driven jobs, said Seppo. 


Best practices identified across the sectors included:  

  • Anchor frameworks and solutions in human rights norms to contribute to safer, more inclusive digital societies. 
  • Make models as transparent as possible. 
  • Increase investments in social research on AI. 
  • Audit models and commit to retraining them as data sets are built. 
  • Work towards representation in STEM industries. 
  • Bake gender-inclusive and community/user-centric approaches into the whole AI life cycle, from design to deployment and use of the resulting tools.
  • If AI cannot be used in compliance with human rights frameworks, it should be banned, and it should not be marketed or sold until safeguards are in place. 
  • Work together with, and co-design solutions with, civil society and the affected populations themselves. 
  • Look at gender-responsive employment and policies, including efforts on STEM careers and inclusive learning environments.

The UNFPA Equity 2030 Alliance is a global effort to accelerate gender equity in science, technology and financing by 2030. The alliance includes more than 50 members, including technology companies, and works to ensure that women and girls in all their diversity are not only statistically visible, but actively included in the design and development of solutions. 
The Ambassadors of Ghana and Norway. Photo: Permanent Mission of Norway
In his closing remarks, Ghana’s Ambassador Emmanuel Kwame Asiedu Antwi emphasized that unless the course is corrected now, historic inequities will only be reinforced. There are risks and global concerns, such as algorithms working without borders, which means that we must move forward together to solve this immediate problem and grasp the opportunity, he said. This includes working together to place gender firmly in the Global Digital Compact to ensure that these issues are addressed for this and future generations.