Transit Management with AI

About the video

In this excellent case study, Francesca Lucchini of the National Center for Artificial Intelligence (CENIA) in Chile walks through how multidisciplinary teams, essential questions, interrogating stereotypes, and more can help researchers catalyze agency and a human rights perspective in their work by beginning with decisions about what is measurable.

About the author

Francesca Lucchini is part of the tech transfer team at the National Center for Artificial Intelligence (CENIA) in Chile. She studied computer science as an undergraduate at Pontificia Universidad Católica de Chile, then continued at the same university with a master's degree in Engineering, focusing on AI applied to city graphs.

Currently, she works as a project director for a State Public Contest of the Chilean Undersecretariat of Transportation.
Francesca is interested in gender equality and gender parity in STEM careers, specifically computer science and AI. She has developed and taught short courses and workshops aimed at raising interest in AI among high school and college students.

Recommended resources

Gender/Age Bias Resources

→ Brandao, Martim. “Age and gender bias in pedestrian detection algorithms.” arXiv preprint arXiv:1906.10490. Presented at the Workshop on Fairness, Accountability, Transparency and Ethics in Computer Vision (FATE CV) at CVPR 2019.

→ Wu, Wenying, et al. “Gender classification and bias mitigation in facial images.” Proceedings of the 12th ACM Conference on Web Science. 2020.


Multi-Object Tracking Resources

→ StrongSORT: Make DeepSORT Great Again

→ Observation-Centric SORT: Rethinking SORT for Robust Multi-Object Tracking

→ ByteTrack: Multi-Object Tracking by Associating Every Detection Box

→ BoT-SORT: Robust Associations Multi-Pedestrian Tracking


Join our community

We are committed to advancing human rights-based approaches in AI and encourage anyone interested in learning more from a global perspective to explore and contribute to our community!