Women's Forum For The Economy & Society (WFES) Research into Gender Bias

Dear All,

As a sponsor of this year's Women's Forum For The Economy & Society (WFES), FTI, along with other WFES partners, will be carrying out an in-depth research project on addressing gender bias in AI. We would like to explore the potential interest some members of the AI Alliance may have in participating in this task. No funding contribution is required. Should you be interested, please let us know. We are hoping to finalize the list of external experts participating in this task by the end of the month.

Formal institutional partners:


• The role of the Institutional Partner would be to lend their expertise to inform, challenge and validate the framing of our research.

• In this capacity, Institutional Partners provide a critical and independent voice as well as a non-private sector lens.

• Finally, we hope that our Institutional Partners will help amplify the work and the outcomes with their stakeholders, acting as a spokesperson at the Women's Forum, and possibly other events.

• The Institutional Partner is not asked to commit any resources other than their expertise and what we expect will be a reasonable time commitment.

Benefits:

• Appear as an institutional partner in all content produced related to the research.

• Be invited to attend and, in some cases, speak at the meetings of the Women's Forum where the research is launched and profiled.

Subject matter experts

Role:

• Act as a sounding board for the research framing, methodology definition and proposed outcomes, as an expert in the ecosystem.

• Advise on related initiatives and work being carried out in the space.

• Be invited to attend the kick-off meeting (in NY – July 27) to provide initial thoughts and comments.

• Opportunity to get more involved at different stages of the process, depending on area of expertise – e.g. research articulation or content amplification.

Benefits:

• Exchange with peers on issues of common interest.

• When appropriate, their contribution will be acknowledged in the final output.



Submitted by Malay Upadhyay on Tue, 17/07/2018 - 00:37

Dear Emmanouil,


It is fantastic to hear of the research initiative around gender bias in AI. This bias is deeply felt among those responsible for devising AI solutions. I believe it is also important to have male voices in this initiative and forum, simply because diversity and inclusion cannot be produced through the efforts of one half of the population alone. As such, I'd certainly like to be part of this project. Can you please share further details on next steps and requirements?


There are three specific problems hindering diversity in AI, among others, that I have encountered in my time raising awareness of AI among corporate executives. I shared my thoughts on this in a different thread, but allow me to elaborate further here:


  1. Funding: Given that only about 2% of all VC funding in the US went to female-founded startups in 2017, it is difficult to envisage how innovation can be given a fair chance, no matter how many women engage in this field.
  2. Data: AI depends on historical data, and that data is currently skewed. Even the most meticulously designed AI today will incorrectly predict that men are likely to earn more than women, simply because of the skewed salary data at hand: men have traditionally been the primary earning workforce across the world for much of modern history.
  3. Habit: Initiatives to teach girls to code, such as those of Doina Oncel at hEr VOLUTION or of CoderGirl, are commendable. However, the problem is not only in getting women to pursue careers in STEM, but also in ensuring they stay. For instance, it has been observed that women can be more averse to a sustained career in coding due to the loneliness and lack of social outreach that often accompany the work of a professional coder over time.
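The data problem in point 2 can be sketched in a few lines of Python. This is a toy illustration with entirely invented salary figures, not a real model: a naive predictor that memorizes per-group averages from historical records will simply reproduce the historical gap.

```python
# Invented historical salary records, skewed by group (illustrative only).
historical_salaries = {
    "men":   [62000, 70000, 68000, 75000],
    "women": [48000, 52000, 50000, 55000],
}

# "Training" here is just memorizing the group means found in the data.
model = {group: sum(s) / len(s) for group, s in historical_salaries.items()}

def predict_salary(gender: str) -> float:
    """Predict salary purely from group membership, as learned from the data."""
    return model[gender]

# The model faithfully reproduces the bias baked into its training data.
print(predict_salary("men") > predict_salary("women"))  # True
```

However meticulously such a model is built, its predictions can only be as fair as the records it was trained on.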


I hope we can make a valuable contribution together on this front, and I look forward to updates on the WFES research.




In reply to Malay Upadhyay

Submitted by Emmanouil PATAVOS on Tue, 17/07/2018 - 13:39

I will send you a private message shortly with further details. Thanks!

Submitted by Bjoern Juretzki on Tue, 17/07/2018 - 18:10

That reminds me of a story that probably everybody has heard by now: in the Turkish language, there is one pronoun, "o," that covers every kind of singular third person. Whether it's a he, a she, or an it, it's an "o." That's not the case in English.

So when some translation software goes from Turkish to English, it just has to guess whether “o” means he, she, or it. And those translations reveal the algorithm’s gender bias.

So "he is a nurse, she is a doctor" becomes "O bir hemşire, o bir doktor." When translated back into English, this becomes "She is a nurse, he is a doctor."

I guess most translation engines have fixed this bug by now.
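The guessing step can be sketched as a toy Python model. This is not how any real translation engine is implemented; the co-occurrence counts below are invented purely to show how a frequency-based system ends up attaching stereotyped pronouns to professions when the source language gives no gender signal.

```python
# Invented co-occurrence counts: how often each English pronoun appeared
# with each profession in a (fictional) training corpus.
corpus_counts = {
    "doktor":  {"he": 90, "she": 10},   # "doktor" = "doctor"
    "hemsire": {"he": 5,  "she": 95},   # "hemşire" = "nurse" (ASCII key)
}

PROFESSION_EN = {"doktor": "doctor", "hemsire": "nurse"}

def translate(turkish_sentence: str) -> str:
    """Translate 'O bir <profession>', guessing the pronoun from raw counts."""
    profession = turkish_sentence.lower().split()[-1]
    counts = corpus_counts[profession]
    pronoun = max(counts, key=counts.get)  # most frequent pronoun wins
    return f"{pronoun.capitalize()} is a {PROFESSION_EN[profession]}"

print(translate("O bir doktor"))   # the skewed counts pick "He"
print(translate("O bir hemsire"))  # the skewed counts pick "She"
```

Because the genderless "o" carries no signal of its own, the model can only fall back on corpus frequencies, and the stereotype in the data becomes the translation.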


Submitted by Mariana POPOVA on Tue, 24/07/2018 - 15:55


Your project is well in line with the EC's activities in this domain. Besides encouraging more women to be involved in the development of AI algorithms, we are also starting to invest more and more in examining the outcomes of decisions made by algorithms and shedding more light on the process that leads to them. Here the GDPR is quite helpful, as it broadens the definition of profiling activities, which should give us greater opportunities to question and combat profiling-based decisions, including gender-based ones.

You could also consider sharing information about your project with people from the European Network for Women in Digital https://ec.europa.eu/digital-single-market/en/european-network-women-di…

Submitted by Emmanouil PATAVOS on Wed, 18/07/2018 - 15:58

Thank you, Mariana. Would you be interested in getting involved with WFES as well?

In reply to Emmanouil PATAVOS

Submitted by Mariana POPOVA on Wed, 18/07/2018 - 16:34

I will forward your information to colleagues active in the domain, with a view to possible future synergies.

Submitted by Mona Sloane on Mon, 23/07/2018 - 17:26

Dear Emmanouil,

Thank you for sharing the details of this interesting initiative. I think it is vital to apply a holistic approach to 'gender bias', as Mariana Popova has already stressed. Both 'gender' and 'bias' have multiple meanings in multiple domains, and a productive debate about 'gender bias' in AI needs to be reflexive of that, as well as specific. The dominance of male individuals in tech is just one, albeit crucial, aspect of 'gender bias' in AI innovation. The make-up of training data (both visual and linguistic) is another (here, I would recommend building on this key article on 'debiasing word embeddings': https://arxiv.org/abs/1607.06520). Both are grounded in wider structural inequalities that show not only in AI innovation, but in many industries and parts of society. It is crucial to make the links between structural and contextual inequality and 'bias' explicit in order to devise more inclusive policies moving forward. Good luck with your work!
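For readers curious about the mechanics behind that article, its core bias measure (projecting word vectors onto a "gender direction" defined by pronoun pairs) can be sketched with tiny hand-made vectors. The 3-d numbers below are invented for illustration and are not real embeddings.

```python
import numpy as np

# Invented toy "embeddings" (real ones have hundreds of dimensions).
emb = {
    "he":     np.array([ 1.0, 0.0, 0.2]),
    "she":    np.array([-1.0, 0.0, 0.2]),
    "doctor": np.array([ 0.6, 0.8, 0.1]),
    "nurse":  np.array([-0.7, 0.7, 0.1]),
}

def unit(v):
    """Normalize a vector to unit length."""
    return v / np.linalg.norm(v)

# The "gender direction": difference between a male and a female anchor word.
gender_dir = unit(emb["he"] - emb["she"])

def gender_bias(word: str) -> float:
    """Projection of a word vector onto the gender direction.
    Positive = leans male, negative = leans female in this toy space."""
    return float(np.dot(unit(emb[word]), gender_dir))

print(f"doctor: {gender_bias('doctor'):+.2f}")  # positive in this toy space
print(f"nurse:  {gender_bias('nurse'):+.2f}")   # negative in this toy space
```

In the real embeddings studied in the article, neutral occupation words showing non-zero projections of exactly this kind are the measurable trace of the training-data bias discussed above.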

In reply to Mona Sloane

Submitted by Emmanouil PATAVOS on Mon, 23/07/2018 - 17:48

Thank you, Mona.

Would you be interested in participating as well as an external expert?

All the best