For anyone interested in gender data and/for gender equality, here are the slides I used today at a seminar at the Human Development Report Office - United Nations Development Programme. Among other points and arguments, I tried to explain five main messages that go 'beyond and below' the standard 'Gender Data for Gender Equality' narrative:

1. AI Can Amplify Gender Biases, Stereotypes and Harms
2. Data Can Be Oppressive, Extractive and Dangerous
3. 'Sex-Disaggregated Data' Can Negate or Reduce Gender
4. 'Gender Data' Should Not Be Reduced to Gender Statistics
5. Gender Data Don't & Won't Matter Without Cultural Change

And my concluding slide reads as follows:

1) To fight gender inequality, there is no question we need more and better data, using AI etc.
2) But there is a 'Data Dilemma':
   a. Data collection on sensitive topics can be dangerous and extractive.
   b. Datasets and AI models often reflect and may reinforce gender inequalities; AI reproduces and amplifies biases. We need to be in tight control. ("Who needs enemies when you have friends like these?" "Keep your friends close and your enemies closer.")
   c. More/better data are not always necessary; sometimes gender data gaps are not the real problem, and focusing on or lamenting them may be an easy excuse for inaction and a distraction from too many pain points.
   d. Keep in mind that data as statistics objectify and always (over)simplify the human experience, especially when "gender data" become "gender statistics" and "gender statistics" further become "sex-disaggregated data". We need nuance, complexity and creativity!
3) But all of this is a part and a start: it is a process, a project of social change that should include but go beyond/below "better gender data for better decisions for better lives".
4) Reaching gender equality (and more) armed with data requires a cultural change and a political change: towards greater awareness, greater abilities, greater rights and opportunities for women and girls, men and boys, AND all groups and individuals who want change to be in the loop and have a say (some call it "Data Feminism"; I call it 'Human AI'). Feel free to share, comment, etc.
Importance of Sex-Disaggregated Data in AI
Summary
Sex-disaggregated data in AI means collecting and analyzing information separated by individuals' sex, which helps us understand how technology affects men and women differently. Recognizing the importance of this data helps make AI systems fairer, less biased, and better at reflecting the realities of diverse groups in society.
- Prioritize inclusive teams: Make it a point to involve people of all genders when designing and building AI systems to avoid bias and represent everyone’s needs.
- Track data gaps: Regularly review your datasets to spot missing or underrepresented groups, then seek out new sources or approaches to fill those gaps.
- Set actionable goals: Establish clear targets for gender representation in both your workforce and your AI outcomes, and create accountability processes to make real progress.
Removing gender data can worsen AI bias. In 2019, Apple Card was accused of discriminating against women. The company declined a woman's application for a credit line increase even though her credit records were better than her husband's, while granting her husband a credit line twenty times higher than hers. The New York State Department of Financial Services found no violations of fair lending, since Apple had not used gender data in developing its algorithms. If Apple had fully adhered to anti-discrimination laws, what led to this paradoxical outcome?

A recent research paper explains the paradox. The researchers found that anti-discrimination measures and laws, specifically those restricting the collection and use of sensitive data for ML models, can have the opposite of the intended effect. Looking at an example dataset from a global financial company, they found that, all else being equal, women are better borrowers than men, and individuals with more work experience are better borrowers than those with less. Thus a woman with three years of work experience could be as creditworthy as a man with five years of experience. The dataset also showed that women tend to have less work experience than men on average. In addition, the dataset used to train the AI algorithms, comprising information on past borrowers, consisted of about 80 percent men and 20 percent women on average globally.

In the absence of gender data, the model treated individuals with the same number of years of experience equally. Since women represent a minority of past borrowers, it is unsurprising that the algorithm would predict the average person to behave like a man rather than a woman: applicants with five years of experience would be granted credit, while those with three years or less would be denied, regardless of gender.
This not only increased discrimination but also hurt profitability: women with three years of work experience would have been creditworthy enough and should have been issued loans, had the algorithm used gender data to differentiate between women and men. The researchers compared outcomes in jurisdictions like Singapore, where gender data can be included, and the EU, where the collection of gender data is allowed but not its use in the final model. They also looked at a methodology for creating a secondary model to predict the gender of an applicant. This approach increased accuracy to 91% and reduced gender discrimination by almost 70 percent (as well as increasing profitability by 0.15 percent). This research again shows how important it is for companies to understand the deeper workings of ML algorithms and their linkage to the underlying (training) data. Source: https://lnkd.in/daZkrC_x
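To make the mechanism concrete, here is a toy simulation (my own sketch, not the researchers' actual model; all thresholds and proportions are illustrative stand-ins for the numbers quoted above): creditworthiness depends on work experience with a lower bar for women, past borrowers are 80% male, and a "gender-blind" model must pick a single experience cutoff for everyone.

```python
import random

random.seed(0)

# Illustrative synthetic data mirroring the setup described above:
# a woman with 3 years of experience repays as reliably as a man with 5,
# and past borrowers are roughly 80% men, 20% women.
def make_borrowers(n=10_000):
    borrowers = []
    for _ in range(n):
        gender = "M" if random.random() < 0.8 else "F"
        experience = random.randint(0, 10)
        bar = 5 if gender == "M" else 3  # creditworthy at or above this
        borrowers.append((gender, experience, experience >= bar))
    return borrowers

data = make_borrowers()

# A gender-blind model here reduces to one experience cutoff for everyone.
# Fitting it by overall accuracy, the majority-male data pulls the cutoff
# toward the male bar, mispredicting creditworthy women with 3-4 years.
def best_blind_cutoff(data):
    def accuracy(cut):
        return sum((exp >= cut) == good for _, exp, good in data) / len(data)
    return max(range(11), key=accuracy)

cutoff = best_blind_cutoff(data)
denied_women = [b for b in data if b[0] == "F" and b[2] and b[1] < cutoff]

print("gender-blind cutoff:", cutoff)
print("creditworthy women denied:", len(denied_women))
```

Because 80% of the training examples are men, the accuracy-maximizing single cutoff is the male bar of 5 years, so every creditworthy woman with 3-4 years of experience is denied: the same paradox the post describes.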
-
AI systems built without women's voices miss half the world and actively distort reality for everyone. On International Women's Day - and every day - this truth demands our attention. After more than two decades working at the intersection of technological innovation and human rights, I've observed a consistent pattern: systems designed without inclusive input inevitably encode the inequalities of the world we have today, incorporating biases into data, algorithms, and even policy. Building technology that works requires our shared participation as the foundation of effective innovation.

The data is sobering: women represent only 30% of the AI workforce and a mere 12% of AI research and development positions, according to UNESCO's Gender and AI Outlook. This absence shapes the technology itself. And a UNESCO study on Large Language Models (LLMs) found persistent gender biases: female names were disproportionately linked to domestic roles, while male names were associated with leadership and executive careers.

UNESCO's @women4EthicalAI initiative, led by the visionary and inspiring Gabriela Ramos and Dr. Alessandra Sala, is fighting this pattern by developing frameworks for non-discriminatory AI and pushing for gender equity in technology leadership. Their work extends the UNESCO Recommendation on the Ethics of AI, a powerful global standard centering human rights in AI governance.

The decision before us today is whether AI will replicate today's inequities or help us build something better. Examine your AI teams and processes today. Where are the gaps in representation affecting your outcomes? Document these blind spots, set measurable inclusion targets, and build accountability systems that outlast good intentions. The technology we create reflects who creates it - and gives us a path to a better world.

#InternationalWomensDay #AI #GenderBias #EthicalAI #WomenInAI #UNESCO #ArtificialIntelligence The Patrick J. McGovern Foundation Mariagrazia Squicciarini Miriam Vogel Vivian Schiller Karen Gill Mary Rodriguez, MBA Erika Quada Mathilde Barge Gwen Hotaling Yolanda Botti-Lodovico
-
Women are severely underrepresented across all seniority levels in the AI industry globally. This is a pattern we're familiar with. It's also a pattern we have ways to break, and having timely data to understand the problem is a great first step. Kudos to Interface for launching this report last year, highlighting AI's missing link: the gender gap in the talent pool. (Siddhi Pal, Ruggero Marino Lazzaroni, Paula Mendoza)

Analysis of data on nearly 1.6 million AI professionals worldwide reveals stark gender imbalances. Women comprise only 22% of AI talent globally, with even lower representation at senior levels, occupying less than 14% of senior executive roles in AI. Within the EU, the disparity is equally concerning. Europe has closed 75% of its overall gender gap, with Sweden and Germany among the top five European economies in closing it. However, the data reveal a stark contrast in the AI sector: Germany and Sweden have some of the lowest female representation in their AI workforces in the EU, at 20.3% and 22.4% respectively. This discrepancy raises serious questions about the unique barriers women face in the AI field. The data are from 2024; as this sector is moving at lightning speed, I'd be curious to get an updated view.

Addressing this diversity crisis is not just a matter of equality; it is crucial for developing AI systems that work equitably for all of society. As the EU aims to become a global leader in AI, closing the gender gap must be a priority. The same goes for the nonprofit sector. Climate Collective's current cohort of CIVIC orgs that we're training on tech-agnostic AI skills has a 50-50 gender balance. Yes, the goal was set with intent and required us to be thoughtful about participant selection - but to those filling expert panels with only white males and saying they couldn't find more diverse talent: I call your bluff.