AI and Equality


Technology-Facilitated Gender-Based Violence in Africa: When AI Becomes a Weapon

About the Case Study

This case study details how AI systems are weaponized to perpetrate gender-based violence across 11 African countries. It highlights cases such as the deepfake attack on Ethiopian Mayor Adanech Abiebie and the coordinated harassment of Brenda Biya, daughter of Cameroon’s President, revealing how engagement-optimized algorithms amplify harmful content. Research by Code for Africa exposes systematic failures in content moderation caused by cultural and linguistic gaps, and proposes architectural changes, community ownership, and regulatory frameworks to embed human rights principles into AI design, aiming to reclaim AI for human dignity.


The Toolbox applies a Human Rights-based AI Lifecycle Framework, integrating reflective questions and the Human Rights Impact Assessment (HRIA) developed with the Alan Turing Institute.

It emphasizes participatory, multidisciplinary approaches; is rooted in feminist, decolonial, and Justice, Equity, Diversity, and Inclusion (JEDI) principles; and incorporates lessons from emerging digital rights challenges, ensuring AI systems are designed with safety and dignity at their core.

We’re creating a global community that brings together individuals passionate about inclusive, human rights-based AI.

Join our AI & Equality community of students, academics, data scientists, and AI practitioners who believe in responsible and fair AI.