Dear All,
As a sponsor of this year's Women's Forum for the Economy & Society (WFES), FTI, along with other WFES partners, will be carrying out an in-depth research project on addressing gender bias in AI. We would like to explore the potential interest some members of the AI Alliance may have in participating in this task. No financial contribution is required. Should you be interested, please let us know. We are hoping to finalize the list of external experts participating in this task by the end of the month.
Formal institutional partners:
Role:
• The role of the Institutional Partner would be to lend their expertise to inform, challenge and validate the framing of our research.
• In this capacity, Institutional Partners provide a critical and independent voice as well as a non-private sector lens.
• Finally, we hope that our Institutional Partners will help amplify the work and the outcomes with their stakeholders, acting as a spokesperson at the Women's Forum, and possibly other events.
• The Institutional Partner is not asked to commit any resources other than their expertise and what we expect will be a reasonable time commitment.
Benefits:
• Appear as an institutional partner in all content produced related to the research.
• Be invited to attend, and in some cases speak at, the meetings of the Women’s Forum where the research is launched and profiled.
Subject matter experts
Role:
• Act as a sounding board for the research framing, methodology definition and proposed outcomes, as an expert in the ecosystem.
• Advise on related initiatives and work being carried out in the space.
• Be invited to attend the kick-off meeting (in NY – July 27) to provide initial thoughts and comments.
• Opportunity to get more involved at different stages of the process depending on area of expertise – e.g. research articulation or content amplification.
Benefits:
• Exchange with peers on issues of common interest.
• When appropriate, their contribution will be acknowledged in the final output.
- Tags: AI Bias, Gender bias, WFES
Comments

In reply to Dear Emmanouil, by Malay Upadhyay

I will shortly send you a private message with further details. Thanks!

That reminds me of a story that probably everybody has heard by now: the Turkish language has a single pronoun, “o,” that covers every kind of singular third person. Whether it’s a he, a she, or an it, it’s an “o.” That’s not the case in English.
So when translation software goes from Turkish to English, it has to guess whether “o” means he, she, or it, and those guesses reveal the algorithm’s gender bias.
For example, "he is a nurse, she is a doctor" becomes "O bir hemşire, o bir doktor" in Turkish. Translated back into English, this comes out as "She is a nurse, he is a doctor".
I guess most translation engines have fixed this bug by now.
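The round trip described above can be sketched in a few lines. This is a deliberately naive toy of my own (not how any real translation engine works): the English-to-Turkish step collapses "he" and "she" into the genderless "o", and the step back guesses the pronoun from a hard-coded stereotype table, standing in for the biased associations a system learns from skewed training data.

```python
# Toy round-trip "translation" illustrating the Turkish pronoun problem.
# The stereotype table below is a stand-in for bias learned from data;
# all names and mappings here are illustrative, not a real engine's.

STEREOTYPE = {"nurse": "she", "doctor": "he"}  # biased fallback guesses

def to_turkish_pronoun(pronoun):
    # Turkish uses a single third-person singular pronoun, "o":
    # gender information is lost in this direction.
    return "o" if pronoun in ("he", "she", "it") else pronoun

def back_to_english(pronoun, noun):
    # Gender was lost, so the system must guess; here it falls back
    # on the (biased) stereotype table.
    if pronoun == "o":
        return STEREOTYPE.get(noun, "it")
    return pronoun

def round_trip(pronoun, noun):
    return back_to_english(to_turkish_pronoun(pronoun), noun)

# Counter-stereotypical input flips to the stereotype on the way back:
print(round_trip("he", "nurse"))    # → she
print(round_trip("she", "doctor"))  # → he
```

The point of the sketch is that the bias lives in the fallback guess, not in the grammar: once the gender distinction is collapsed to "o", any reconstruction has to come from somewhere, and in a learned system that "somewhere" is the statistics of the training corpus.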

Hi,
Your project is very much in line with EC activities in this domain. Besides encouraging more women to be involved in developing AI algorithms, we are also starting to invest more and more in querying the outcomes of decisions made by algorithms and shedding more light on the processes that lead to them. Here the GDPR is quite helpful, as it broadens the definition of profiling activities, which should give us greater opportunities to question and combat profiling-based decisions, including gender-based ones.
You could also consider sharing information about your project with people from the European Network for Women in Digital https://ec.europa.eu/digital-single-market/en/european-network-women-di…

Thank you, Mariana. Would you be interested in getting involved with WFES as well?
In reply to Thank You Mariana, Would you by Emmanouil PATAVOS

I will forward your info to colleagues active in the domain with a view to possible future synergies.

Dear Emmanouil,
Thank you for sharing the details of this interesting initiative. I think it is vital to apply a holistic approach to 'gender bias', as Mariana Popova has already stressed below. Both 'gender' and 'bias' have multiple meanings in multiple domains, and a productive debate about 'gender bias' in AI needs to be reflexive of that, as well as specific. The dominance of male individuals in tech is just one, albeit crucial, aspect of 'gender bias' in AI innovation. The make-up of training data (both visual and linguistic) is another (here, I would recommend building on this key article on 'debiasing word embeddings': https://arxiv.org/abs/1607.06520). Both are grounded in wider structural inequalities that show not only in AI innovation, but in many industries and parts of society. It is crucial to make the links between structural and contextual inequality and 'bias' explicit in order to devise more inclusive policies moving forward. Good luck with your work!
In reply to Dear Emmanouil, by Mona Sloane

Thank you Mona,
Would you be interested in participating as well as an external expert?
All the best
Dear Emmanouil,
It is fantastic to hear of the research initiative around gender bias in AI. This bias is deeply felt among those responsible for devising AI solutions. I believe it is also important to have male voices in this initiative and forum, simply because diversity and inclusion cannot be produced through the efforts of one half of the population alone. As such, I’d certainly like to be part of this project. Can you please share further details on next steps and further requirements?
There are three specific problems, among others, hindering diversity in AI that I have encountered in my time spreading awareness of AI among corporate executives. I shared my thoughts around this in a different thread, but allow me to elaborate further here:
I hope we can together make a valuable contribution on this front and look forward to updates on the WFES research.
Regards
Malay