BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//AI and Equality - ECPv6.13.2.1//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://aiequalitytoolbox.com
X-WR-CALDESC:Events for AI and Equality
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/London
BEGIN:DAYLIGHT
TZOFFSETFROM:+0000
TZOFFSETTO:+0100
TZNAME:BST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0100
TZOFFSETTO:+0000
TZNAME:GMT
DTSTART:20261025T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20260308T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20261101T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20260305T150000
DTEND;TZID=Europe/London:20260305T160000
DTSTAMP:20260415T221206Z
CREATED:20260218T102325Z
LAST-MODIFIED:20260218T105000Z
UID:10000024-1772722800-1772726400@aiequalitytoolbox.com
SUMMARY:SafeHer: A Reporting Tool for Technology-Facilitated Gender-Based Violence in Kenya with Lilian Olivia Orero | AI & Equality Open Studio
DESCRIPTION:Open Studio | SafeHer: A Reporting Tool for Technology-Facilitated Gender-Based Violence in Kenya. SafeHer is a reporting tool developed by SafeOnline Women Kenya (SOW-Kenya) that enables women and girls in Kenya to report incidents of technology-facilitated gender-based violence. The tool completed user testing with 36 participants and recently won the National Models for Women’s Safety Online (NMWSO) Safety by Design Award from IREX and the Gates Foundation. With the Google Play Store listing in progress and national scaling planned for 2026\, this is an ideal moment to receive community feedback on the reporting design\, methodology and implementation framework. Explore: https://safeherkenya.org/ \nAbout the speaker:\nLilian Olivia Orero is the Founder of SafeOnline Women Kenya (SOW-Kenya)\, where she leads the development of SafeHer\, a mobile app enabling women and girls in Kenya to report technology-facilitated gender-based violence. She holds an LLM with Distinction in Law\, Innovation & Technology from the University of Bristol\, where her dissertation examined how dark patterns on social media platforms enable gendered cyberbullying under EU Digital Services Act regulation. \n\n\nRegister here via our community on Circle
URL:https://aiequalitytoolbox.com/event/safeher-a-reporting-tool-for-technology-facilitated-gender-based-violence-in-kenya-with-lilian-olivia-orero/
ATTACH;FMTTYPE=image/png:https://aiequalitytoolbox.com/wp-content/uploads/2026/02/3.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20260311T163000
DTEND;TZID=America/New_York:20260311T180000
DTSTAMP:20260415T221206Z
CREATED:20260303T133922Z
LAST-MODIFIED:20260303T134024Z
UID:10000034-1773246600-1773252000@aiequalitytoolbox.com
SUMMARY:CSW70 | When Algorithms Discriminate: Gender Bias in Justice Systems
DESCRIPTION:Wednesday\, 11 March 2026 \n4:30 – 6:00 PM ET \nNGO CSW \n10th Floor\, Church Center of the United Nations \n777 United Nations Plaza\, New York \n \nAn In-Depth Discussion: What happens when courts replace judges with computer algorithms? We are told these systems are “objective” and “fair”\, but the evidence tells a different story. From bail decisions to sentencing\, algorithms are making life-changing choices about women based on biased data and male-centered assumptions. A woman seeking justice after assault may find her credibility automatically questioned. A mother may be flagged as “high risk” simply because of where she lives or her employment history. Meanwhile\, these same systems treat men’s violence as more predictable and less dangerous. \nThis is not science fiction. It is happening right now in courts worldwide. Join us to uncover how technology is creating new barriers to justice for women and girls\, and what policy solutions can effectively address them. \n \nLaura Nyirinkindi | UN Special Procedures Member\, Working Group on discrimination against women and girls\nAfrica Regional Vice President of the International Federation of Women Lawyers (Federación Internacional de Abogadas) \nFernanda K. Martins | Fundación Multitudes\, Director of Strategy and Advocacy \nCaitlin Kraft–Buchman | Women At The Table\, CEO
URL:https://aiequalitytoolbox.com/event/csw70-when-algorithms-discriminate-gender-bias-in-justice-systems/
ATTACH;FMTTYPE=image/png:https://aiequalitytoolbox.com/wp-content/uploads/2026/03/Designing-AI-for-Human-Agency-29.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=Europe/London:20260326T150000
DTEND;TZID=Europe/London:20260326T160000
DTSTAMP:20260415T221206Z
CREATED:20260218T102603Z
LAST-MODIFIED:20260218T104946Z
UID:10000025-1774537200-1774540800@aiequalitytoolbox.com
SUMMARY:Beyond the Math: Why AI Fairness Needs a Feminist Lens with Marie Mirsch | AI & Equality Pub-Talk
DESCRIPTION:🔗 Access the paper: https://link.springer.com/article/10.1007/s43681-025-00926-y \nAlthough research on algorithmic fairness is inherently interdisciplinary\, many proposed fairness approaches remain predominantly technical in their treatment of societal concepts such as fairness and justice. While these approaches often claim to operationalize insights from the social sciences\, they frequently do so in ways that appropriate rather than meaningfully engage with the underlying theories. This paper critiques this practice through the lens of intersectionality. Adopting a “calling in” rather than “calling out” stance toward AI practitioners\, it offers actionable guidance on how intersectionality can be substantively incorporated into technical work\, thereby recentring social science theory within the field of algorithmic fairness. \nAbout the speaker:\nMarie Mirsch\, M.Sc.\, is a research assistant and doctoral candidate at RWTH Aachen University. She conducts research at the intersection of mathematics\, ethics\, and the social sciences\, with the aim of anchoring diversity perspectives in technology. Her interdisciplinary research focuses on intersectionality in the context of algorithmic fairness – a central topic of current AI research. She also considers aspects of procedural fairness and participatory approaches\, such as including citizens’ perspectives. As part of her work at the bridging professorship “Gender and Diversity in Engineering”\, she also addresses ethical issues relating to the use of artificial intelligence in engineering. A Research Fellowship from the BMBF-funded AI Campus supports her research. \nAs project manager of the Responsible Research and Innovation (RRI) Hub at RWTH Aachen University\, she coordinates and implements national and international projects to strengthen social responsibility in technology development as part of the “ENHANCE – European Universities of Technology Alliance”. \nHer teaching activities include the seminar “Responsible AI for Engineers” at RWTH\, the online course “Responsible Innovators for Tomorrow” as part of the ENHANCE alliance\, and the workshop “Responsible AI” at the University of Koblenz. \n\n\nRegister here via our community on Circle
URL:https://aiequalitytoolbox.com/event/beyond-the-math-why-ai-fairness-needs-a-feminist-lens-with-marie-mirsch/
ATTACH;FMTTYPE=image/png:https://aiequalitytoolbox.com/wp-content/uploads/2026/02/4.png
END:VEVENT
END:VCALENDAR