Fertility tracker apps are reshaping how millions monitor their reproductive health. From tracking menstrual cycles to predicting ovulation and monitoring pregnancy, these tools promise convenience and insight. But what do they cost in terms of privacy? Researcher Anna Ida Hudig, from the Compliant and Accountable Systems Group at the University of Cambridge, is examining how fertility trackers collect and share intimate data, and what users can do about it.
In a recent AI & Equality PubTalk, Hudig presented her research, which takes a deep dive into the real-world data practices of fertility tracker apps and devices. These trackers collect highly intimate data, including menstruation dates, moods, activity levels, physical symptoms, sex drive, and, in some cases, data from physical devices monitoring body temperature and hormone levels. Recognizing the sensitive nature of this data, Hudig’s research aimed to explore the actual data-sharing practices of fertility trackers, understand user preferences and attitudes towards sharing this data, and provide recommendations for better transparency and control mechanisms.
Her team conducted a technical analysis of eight popular products, capturing data traffic to identify where user information was being sent and who might be receiving it. On average, each app interacted with over 54 IP addresses, connecting to more than 13 organizations across seven countries—many outside the EU, such as the US, Ukraine, and Russia. Especially troubling is that data transmission often occurred at the exact moment users entered personal details like menstrual dates or sexual activity.
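For readers curious what this kind of traffic analysis looks like in practice, below is a minimal Python sketch of the general approach: reading a packet capture recorded while an app is in use, collecting the destination IP addresses, and attempting reverse-DNS lookups to hint at which organizations sit on the receiving end. The file name, libraries, and output format are illustrative assumptions, not details of Hudig’s actual monitoring setup.

```python
# Illustrative sketch only: extract unique destination IPs from a packet
# capture and attempt reverse-DNS lookups to hint at receiving organizations.
# "capture.pcap" is an assumed file name, not an artifact from the study.
import socket
from collections import defaultdict

from scapy.all import rdpcap, IP  # pip install scapy

packets = rdpcap("capture.pcap")

destinations = defaultdict(int)
for pkt in packets:
    if IP in pkt:
        destinations[pkt[IP].dst] += 1  # count packets per destination IP

# Print destinations from busiest to quietest, with a best-effort hostname.
for ip, count in sorted(destinations.items(), key=lambda kv: -kv[1]):
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS; may fail
    except OSError:
        host = "unknown"
    print(f"{ip:15s}  {count:5d} packets  {host}")
```

In a real study the resolved hosts would then be mapped to organizations and jurisdictions, which is how figures like "more than 13 organizations across seven countries" emerge.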
To understand how users feel about this, the team surveyed over 160 participants. They introduced a fictional app, “Strawberry,” and asked respondents to indicate which data they would share, with whom, and under what conditions. The results were striking: while participants were more open to sharing de-identified data with doctors or researchers, there was strong reluctance to share with advertisers or employers. Many expressed skepticism about whether “de-identified” data could truly stay anonymous.
Going further, Hudig’s team conducted co-design workshops to explore how fertility apps could better support transparency and user control. Participants expressed varying concerns about data sharing—some limited the information they provided, while others feared data could be used to regulate women’s bodies.
The focus groups generated ten concrete mechanisms to improve transparency and control across the app lifecycle, including selection, setup, use, and post-use. Ideas included short, clear privacy summaries in app stores, onboarding pop-ups indicating mandatory versus optional data, real-time alerts for policy changes, and easy tools for data export or deletion. The discussions also highlighted concerns about user burden, the need for support, and considerations around gender and data donation.
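To make one of those ideas concrete, here is an illustrative sketch (not taken from the workshops) of how an app might declare its data fields as mandatory or optional in a machine-readable form, so an onboarding pop-up can surface that distinction clearly. The field names, purposes, and sharing destinations are hypothetical.

```python
# Hypothetical example: declaring mandatory vs. optional data fields so an
# onboarding screen can show users exactly what they must or may provide.
from dataclasses import dataclass

@dataclass(frozen=True)
class DataField:
    name: str
    purpose: str
    mandatory: bool          # must be provided for core functionality
    shared_with: tuple = ()  # third parties the field may be sent to

ONBOARDING_FIELDS = [
    DataField("cycle_start_date", "cycle predictions", mandatory=True),
    DataField("mood", "pattern insights", mandatory=False),
    DataField("sexual_activity", "fertility window accuracy", mandatory=False,
              shared_with=("analytics_provider",)),
]

def onboarding_summary(fields):
    """Render a plain-text summary suitable for an onboarding pop-up."""
    lines = []
    for f in fields:
        status = "required" if f.mandatory else "optional"
        sharing = ", ".join(f.shared_with) or "no third parties"
        lines.append(f"- {f.name} ({status}): used for {f.purpose}; "
                     f"shared with {sharing}")
    return "\n".join(lines)

print(onboarding_summary(ONBOARDING_FIELDS))
```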
The takeaway? Empowering users isn’t just about offering options—it’s about designing tools that are genuinely usable, respectful of privacy, and transparent by default.
This research builds upon Hudig’s previous work on the Internet of Things (IoT), which examined smart devices and data sharing. Hudig analyzed 11 different types of smart devices, including those collecting health and children’s data. She set up a data monitoring infrastructure to capture network traffic between these devices, their associated apps, and the internet.
Hudig’s work reminds us that fertility data is deeply personal. If these apps are to serve users rather than exploit them, transparency, ethical design, and regulation must catch up with the tech.
About Anna Ida Hudig
Anna Ida Hudig is a PhD researcher at the University of Cambridge, based in the Department of Computer Science and Technology and a member of the Compliant and Accountable Systems Group. Her work explores how digital systems can be designed and governed to prioritise transparency and accountability. Anna Ida combines methods from computer science, law, and the social sciences. She holds an LLB in Law and a BSc in Artificial Intelligence from the University of Amsterdam, and completed an MPhil in Engineering for Sustainable Development at the University of Cambridge (graduating with distinction). Beyond academia, she has worked at a law firm, contributing to various European projects on cybersecurity, autonomous driving, and responsible AI.