Post-Roe, Your Period App Can — And Will — Be Used Against You
Period apps and other FemTech tools are meant to empower users by providing valuable data and insights for managing reproductive and sexual health. In a post-Roe world, with Project 2025 looming, the consequences can be catastrophic if this intimate data falls into the wrong hands, even in states where abortion is legal.
This is part 1 of a series on privacy and FemTech. See part 2, How to Track Your Reproductive Health Data While Protecting Your Privacy, for practical privacy tips.
Travel Bans and Pregnancy Criminalization Are the New Reality
The Lincoln Project ad "State Line" paints a chilling, dystopian picture in which a father and daughter are arrested for attempting to cross state lines for an abortion. The officer knows the daughter's name. He knows she's eight weeks pregnant. He knows she's been spotting. And he knows she's not headed to visit her sister as she claims, but to "a women's health clinic in one of those abortion states." He arrests the father for "crossing state lines to obtain an abortion for a minor under his care" and the daughter for "evading motherhood."
This is not hyperbole. Roe's protections are in the rearview. The wheels are in motion to implement Project 2025's "pro-life, pro-family" agenda under Trump 2.0. Pregnancy-related criminal prosecutions are peaking post-Dobbs, according to the non-profit Pregnancy Justice US. So is reproductive surveillance.
Often, abortion restrictions and anti-LGBTQIA policies coincide, as a recent Mother Jones analysis shows. According to GenDemocracy, the non-profit behind the Stop the Coup 2025 initiative: "It's not about the sex differences or biology, but their need to keep the lines and gender boxes drawn hard and sharp, like the battlefront they represent."
Policing the most intimate aspects of people's lives takes data and surveillance, hence the post-Roe calls to "delete your period apps!"
FemTech Makes Reproductive 'Dataveillance' Trivial
FemTech aims to empower users by helping them gain insights into their reproductive and sexual health. But using these tools can turn intimate – and potentially incriminating – data into a ready-made dossier for law enforcement, a bounty hunter, a jilted ex or an insurance company to weaponize against you.
FemTech refers to "technology including products, services, diagnostics and software addressing health and wellness concerns that solely, disproportionately or differently affect women, girls, non-binary folks, trans people and those assigned female at birth," according to FemTech Canada. This includes period trackers, ovulation predictors, pregnancy apps, menopause hormone trackers and wearables like wellness rings or watches that transmit sensor or biometric data to companion apps for analysis and monitoring.

Sadly, many of these apps are riddled with third-party trackers that share user data across murky ecosystems for targeted advertising, analytics and algorithmic improvement. Installing a FemTech app on your phone can turn it into a digital ankle bracelet, making 'dataveillance' trivial. Data surveillance and privacy expert Roger Clarke, who coined the term, defines it as "the systematic creation and/or use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons."
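To see how trivial the plumbing is, here is a minimal, hypothetical Python sketch of the pattern. The endpoint, advertising ID handling and event names are invented for illustration, not taken from any real SDK:

```python
import json
import urllib.request
import uuid

# Hypothetical third-party analytics collector (illustrative only).
ANALYTICS_ENDPOINT = "https://collector.example-adtech.com/v1/events"

# A persistent advertising identifier lets the collector link this event
# to the same device across every other app embedding the same SDK.
DEVICE_AD_ID = str(uuid.uuid4())

def track_event(name: str, properties: dict) -> None:
    """Send one in-app event, plus the device ad ID, to the third party."""
    payload = json.dumps({
        "ad_id": DEVICE_AD_ID,
        "event": name,
        "properties": properties,
    }).encode("utf-8")
    req = urllib.request.Request(
        ANALYTICS_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # fire-and-forget; the user never sees this

# A single innocuous-looking call inside a period tracker:
track_event("cycle_logged", {"symptom": "spotting", "cycle_day": 28})
```

One line of app code, and an intimate data point has left the device, keyed to an identifier built for cross-app tracking.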
FemTech Users Value Privacy, But They Also Value Their Health Data
FemTech users take their privacy seriously. But they also value the ability to track and gain insights from their detailed reproductive and sexual health data – even in extreme conditions. To illustrate, FemTech 'unicorn' Flo Health was able to analyse data from over 87,315 active Flo users in Ukraine who consistently tracked their symptoms before and immediately after Russia's February 2022 full-scale invasion. (Flo used the data for a study on pain perception under acute stress.)
A 2022 Wired analysis found the mass post-Roe deletion of FemTech apps was swiftly followed by a wave of new downloads. Users weren't abandoning their FemTech apps. They were switching to new apps that promised greater privacy and security. Yet the analysis showed that many apps fell short of their privacy promises. That same year, Mozilla slapped a "Privacy not Included" label on 18 of 25 period and pregnancy tracking apps it had reviewed. Only a handful were 'privacy-first'.
FemTech Apps Are Marketed as Healthcare, but Regulated Like Consumer Apps. This Typically Means Lax Privacy Protections.
"Health and commercial data often fall under different regulatory requirements. When you sell commercial apps for health, the user may not understand that their health data has less safeguards," Privacy Engineer and Binary Tattoo Founder Cat Coode explained to me. "And app and product developers aren't forced by regulatory requirements to put those safeguards in."
This effectively allows healthcare apps to self-regulate while handling some of the most high-risk and consequential data, despite the tech industry's poor track record.
Unfortunately, too many FemTech tools are optimized for commercial surveillance and monetization rather than privacy and medical reliability.
Data That Is Centrally Stored or Shared Across Networks Could Fall Into the Wrong Hands
When you use FemTech tools with poor privacy protections, your sensitive reproductive and sexual health data is up for grabs by anyone with a credit card (including law enforcement) – accuracy and due process be damned.
Zach Edwards, data supply auditor and founder of boutique analytics firm Victory Medium, cautioned in an email to me: "If data is stored, it's at risk of being accessed via government subpoenas. Organizations who provide period tracking apps, yet also store location data or data about their users on centralized servers, are putting all those users at risk."
"Organizations who provide period tracking apps, yet also store location data or data about their users on centralized servers, are putting all those users at risk."Even aggregated and de-identified user data used for research or prediction improvement can provide a "snapshot" for law enforcement to zero in on what to request in a subpoena, warns Edwards. It's important to minimize what's collected and how long it's retained. Without detailed assurances regarding how providers manage re-identification risk, users should assume they can be identified in the data.
It's the Trackers, Stupid
While app providers make grand claims, too often they wait until they are caught by investigative journalists or regulators to truly invest in privacy. In 2023, the Federal Trade Commission ("FTC") found the makers of ovulation tracker app Premom had violated the Health Breach Notification Rule by sharing users' health data with advertisers via integrated Software Development Kits ("SDKs"). It seems the FTC's 2021 settlement with Flo Health on similar grounds (which Flo denies) did not incentivize Premom's makers to put privacy first.
Flo Health has since become the first FemTech app to be dual-certified under ISO standards for information security and personal information management. It also created the award-winning Anonymous Mode, which lets people use the app without associating their data with device identifiers, names or contact details. Edwards has praised this move.
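Flo's actual implementation isn't detailed here, but conceptually an anonymous mode reduces to never binding health entries to a stable identity. A minimal sketch, assuming a random on-device token stands in for any account, contact detail or advertising ID (invented class and field names, not Flo's code):

```python
import secrets

class AnonymousSession:
    """Conceptual sketch of an 'anonymous mode' (illustrative only)."""

    def __init__(self):
        # Random token generated on-device: meaningless to any server,
        # and unlinkable to a person or device if it is never reused
        # alongside other identifiers.
        self.token = secrets.token_hex(16)
        self.entries = []

    def log_entry(self, entry: dict) -> None:
        # No account, no device ID, no name, no contact details attached.
        self.entries.append({"token": self.token, **entry})

session = AnonymousSession()
session.log_entry({"cycle_day": 14, "symptom": "cramps"})
print(session.entries)
```

The design choice that matters is what is absent: there is simply no stable identifier for a subpoena or a breach to resolve back to a person.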
Jeff Jockish, CXO and Privacy Researcher at ObscureIQ, has intimate knowledge of the data broker world. His firm specializes in performing "digital footprint wipes" for clients seeking to reclaim their online privacy. He told me by phone that users should assume that any app they are logged into is tracking them.
"But some specific organizations created those lists, and so if a state law somehow ever is passed that empowers investigations into people who are pregnant, it would be dangerous for any advertising organizations to continue to build those audience segments, for fear of the government attempting to subpoena or access that data." Someone, Somewhere, Is Piecing It AltogetherEdwards agrees that FemTech apps that create unique user IDs or rely on AdIDs put their users at serious risk. In the shady world of data brokers, someone, somewhere, is piecing the data together to create user segments and profiles that can be sold to clients to use as they wish. "But some specific organizations created those lists, and so if a state law somehow ever is passed that empowers investigations into people who are pregnant, it would be dangerous for any advertising organizations to continue to build those audience segments, for fear of the government attempting to subpoena or access that data."
It's an Unregulated Data Buffet
Jockish told me FemTech users shouldn't trust providers' claims that they won't sell their health data. Even if they don't technically "sell" it, sharing it widely for ad-targeting or analytics exposes users to serious risks. App providers have plausible deniability because these apps are not subject to health privacy laws that clearly define "health data."
Reporting from Vice shows it's possible to infer reproductive status or gender identity from FemTech app installs and usage, even without accessing user-inputted data. Location tracking and sensitive points of interest can be particularly revealing, as a chilling 404 Media exposé demonstrated. The ubiquity of generative AI supercharges the ability to infer or predict patterns or reveal someone's identity from seemingly anodyne, "anonymous" data.
Such inferred data is rarely regulated as health data.
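As a toy illustration of how revealing metadata alone can be, here is a hedged Python sketch. The heuristic rules and device records are fabricated; it shows how a broker-style pipeline could flag a "reproductive health" audience segment from app installs and visited points of interest, without ever touching data entered in the app:

```python
# Fabricated example: inferring a sensitive segment from metadata alone.

devices = [
    {"ad_id": "a1b2", "installed_apps": ["period_tracker", "maps"],
     "visited_pois": ["pharmacy", "womens_health_clinic"]},
    {"ad_id": "c3d4", "installed_apps": ["chess", "maps"],
     "visited_pois": ["gym"]},
]

# Invented heuristics of the kind audience-segment builders might use.
PREGNANCY_SIGNAL_APPS = {"period_tracker", "pregnancy_week_by_week"}
SENSITIVE_POIS = {"womens_health_clinic", "abortion_clinic"}

def build_segment(devices: list[dict]) -> list[str]:
    """Return ad IDs whose metadata suggests reproductive health interest."""
    segment = []
    for d in devices:
        app_hit = PREGNANCY_SIGNAL_APPS & set(d["installed_apps"])
        poi_hit = SENSITIVE_POIS & set(d["visited_pois"])
        if app_hit or poi_hit:
            segment.append(d["ad_id"])
    return segment

print(build_segment(devices))  # ['a1b2'], flagged without any in-app data
```

No period dates, no symptoms, no user input: the install list and location trail are enough to put a device on a list.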
In 2023, Washington enacted the My Health My Data Act, the first US law to protect health data that falls outside of HIPAA. It uses a broad definition of consumer health data that includes inferences based on precise geolocation. This is a promising state-level step. But it's the exception, not the rule, in the US.
'Prompt Surveillance': AI Has Entered the Chat
Integrating AI chatbots into FemTech apps can introduce novel risks. Jackie Singh, former Director at the Surveillance Technology Oversight Project (STOP), warns in a recent Substack post that conversational AI introduces a new threat she calls 'Prompt Surveillance': "Prompt Surveillance is the systematic collection, analysis, and potential exploitation of user inputs (prompts) in AI Large Language Models (LLMs), encompassing both the intentional gathering of data for model improvement and the incidental accumulation of user information. This practice involves the retention, processing, and potential misuse of prompts, which may contain sensitive personal, professional, or proprietary information, creating risks to privacy, intellectual property, and individual autonomy."
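To make the term concrete, here is a minimal, hypothetical sketch of how a chatbot backend can quietly retain every prompt a user types. The function names and storage are invented for illustration and are not any vendor's actual pipeline:

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical retention store; real systems feed analytics warehouses.
db = sqlite3.connect("prompt_archive.db")
db.execute("""CREATE TABLE IF NOT EXISTS prompts
              (user_id TEXT, ts TEXT, prompt TEXT)""")

def generate_reply(prompt: str) -> str:
    """Stand-in for a real model call."""
    return "Thanks for sharing. Here's some general information..."

def handle_chat(user_id: str, prompt: str) -> str:
    # The reply is all the user ever sees...
    reply = generate_reply(prompt)
    # ...but the prompt itself is retained indefinitely for "model
    # improvement", and is now exposed to subpoena, breach, or resale.
    db.execute("INSERT INTO prompts VALUES (?, ?, ?)",
               (user_id, datetime.now(timezone.utc).isoformat(), prompt))
    db.commit()
    return reply

handle_chat("user-42", "I'm 8 weeks pregnant and thinking about my options.")
```

The conversation feels ephemeral to the user; the archive is not.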
More Data and Insights That Can Be Misused
Singh explained to me via text how Prompt Surveillance can impact reproductive freedom: "As a cybersecurity expert and mother of three daughters, I want women to understand the complex reality behind AI: it's not just a tool with benefits, but also one with inherent risks. While companies present only positive aspects, they're actually accumulating vast datasets of human thoughts and behaviours without our consent or knowledge. This raises critical questions about ownership, control, and potential consequences for our privacy and autonomy."
Conversations themselves can be 'hacked.' For example, a recent Wired article explained how AI models can infer sensitive personal information such as race, location, and occupation from innocuous chats. Ars Technica has reported on a more elaborate hacking technique, uncovered by researchers, that enabled them to read private, encrypted AI chats.
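One published attack of this kind relied on a token-length side channel in streamed chatbot replies. Here is a simplified, illustrative Python sketch of the core leak only (a toy model; the statistical reconstruction step researchers used is omitted):

```python
# Toy model of a token-length side channel in streamed, encrypted replies.
# Each token is sent as it is generated, in its own encrypted record, and
# stream ciphers add no padding, so ciphertext size tracks token size.

reply_tokens = ["You", " are", " about", " eight", " weeks", " along"]

# A network eavesdropper never sees the words, only the record sizes:
observed_sizes = [len(tok.encode("utf-8")) for tok in reply_tokens]
print(observed_sizes)  # [3, 4, 6, 6, 6, 6]

# The length sequence alone narrows the space of candidate phrasings
# enough for a trained model to reconstruct likely responses.
```

The practical lesson is that "encrypted" does not mean "opaque": metadata such as message timing and size can still betray content.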
There's Light at the End of the Tunnel, But Proceed With Caution
FemTech emerged to fill a serious void in reproductive and sexual healthcare. Medicine carries a historical sex and gender bias because women, transgender and intersex people are often excluded from the research and development pipeline. As abortion-restrictive states become OBGYN deserts, the need for reliable health resources, data and insights will only increase. But if FemTech providers can't protect users' privacy, they put their users' bodily autonomy in jeopardy.
Corporate-driven datafication in FemTech exploits an unacceptable data gap while purporting to give users greater control over their health. Users become bodily objects to mine for profit, not patients whose health, well-being, and privacy are paramount.
You shouldn't have to choose between your healthcare and your privacy. Thankfully, there are privacy-first and privacy-enhanced period apps available. There are also measures you can take to better protect your privacy. In Part 2, we explain how you can track your intimate data while reducing the risks of being tracked.