
Will encryption be the new pro-choice litmus test?

In the weeks between when Politico reported on a draft opinion and the Supreme Court's final ruling on June 24 overturning Roe v. Wade, many Americans were forced to grapple with the ways that their digital footprints could soon be used to prosecute efforts to seek reproductive healthcare.

There’s been a renewed push for federal privacy regulation even as some legislators have appealed directly to companies to take action. Dozens of lawmakers, led by Ron Wyden (D-Ore.), asked Google to limit collection and retention of location-tracking data in light of how it may be used to prosecute people in areas where abortion is criminalized.

Riana Pfefferkorn, encryption policy expert and research scholar at the Stanford Internet Observatory, argues the practical outcome of overturning Roe v. Wade — which triggers laws in several states that criminalize abortion — should create a new litmus test for politicians who identify as pro-choice: whether they support strong encryption. 

“You cannot be both pro-choice and anti-encryption anymore,” she said, referencing the decades-long debate over who should have access to the strongest methods to digitally protect data. 

This interview has been edited for length and clarity. It was originally posted on June 6, 2022 and the introduction was updated following the Supreme Court's June 24 ruling overturning Roe v. Wade.

Andrea Peterson: What’s the connection between reproductive healthcare access and encryption? 

Riana Pfefferkorn: We have digitized every part of our society, from obtaining information about healthcare and medical procedures to telehealth appointments to talking to trusted friends and family members. 

So we are in a position where now a lot of our conversations around abortion access, or seeking information about where to find a clinic or how to get medication, or having a consultation with a doctor who may be helping to obtain medications for self administered abortion, for example — all of those things are digitized now. 

And that means that absent strong encryption, there's information that can now potentially be accessible to law enforcement with the right legal process. 

There have not been a lot of people that I have talked to who have drawn the connection between the fight for reproductive freedom and the encryption policy debate[…] And I realized that probably one reason for that is that throughout the whole time that strong encryption has been commercialized to a point where it is now in widespread consumer use, abortion has always been a privacy right rather than a crime.

AP: How does that change the policy conversation?

RP: Now that abortion is about to be criminalized across half the country, we are facing a situation where we have to grapple with what it means for the encryption debate where the central crux of it has always been about, “Okay, well, how do we balance (and I say that in scare quotes) privacy rights and criminal investigations and criminal enforcement.” 

There's also all this other information out there that can potentially be relevant to an investigation trying to criminalize somebody who has sought or obtained an abortion, who has experienced a miscarriage or stillbirth, or who has helped to perform an abortion.

And so as we look at this kind of “golden age of surveillance” bed that we've made and that we're now lying in, I think the fact that we have a country that is deeply divided in terms of what laws will govern abortion and reproductive care more generally points to that divide, and to how we need to be thinking going forward about the overall legitimacy of the laws on the books in the United States: who gets criminalized, and what behavior gets criminalized instead of being seen as a fundamental right.

You cannot be both pro-choice and anti-encryption anymore. It is no longer a tenable position if you are pro-choice to say encryption is a threat to public safety and law enforcement needs to be able to investigate criminal activities, therefore, we need to weaken encryption or otherwise provide backdoors to allow law enforcement access to encrypted data, because now those crimes include reproductive care.


AP: How has law enforcement run up against mathematical principles when it asks for special access to encrypted communications? How does having access for some law enforcement potentially weaken the overall security?

RP: So the thing about what some call an exceptional access mechanism to encrypted data is that it's just a hole in your defenses. My colleague Alex Stamos has likened it to drilling a hole in the windshield of a car — it may be only one little hole in this thing that's supposed to provide security against the outside world, but not only does that now provide a means for the outside world to get in, it also weakens the overall security of the system. 

And when we're talking about complex systems, it can be very difficult to know how introducing a vulnerability in Point A will affect other places within the overall system. So we've had historically this somewhat simplistic stance of, “Well, we're the good guys because we're American law enforcement or we're British law enforcement or whatever, and we're a rule of law country. We're not China, we're a democracy. And we want you to only introduce this access mechanism just for the good guys and deny it to the bad guys…”

But the bad guys are going to find it if you have introduced this vulnerability into the system — a determined attacker is going to be able to find and exploit that, and it won't just be used by the good guys anymore. 
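Her point that a key is just math, not a badge, can be sketched in a few lines of Python (a toy illustration using a one-time pad, the simplest provably strong cipher, not any real escrow design): the ciphertext cannot distinguish an authorized investigator holding an escrowed key from an attacker who stole a copy of it.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings together."""
    return bytes(x ^ y for x, y in zip(a, b))

# Alice and Bob share a random key as long as the message (a one-time pad).
message = b"meet me at the clinic at noon"
key = secrets.token_bytes(len(message))
ciphertext = xor_bytes(message, key)

# Bob decrypts with the shared key.
assert xor_bytes(ciphertext, key) == message

# An "exceptional access" design escrows a copy of the key for investigators.
escrowed_key = key

# But decryption is pure arithmetic: anyone holding the escrowed key
# recovers the plaintext identically, whether they are an investigator
# with a warrant or a thief who exfiltrated the escrow database.
stolen_copy = escrowed_key
assert xor_bytes(ciphertext, stolen_copy) == message
```

The sketch is deliberately simplistic, but the asymmetry it shows is the crux of the policy debate: the hole drilled for the "good guys" works exactly the same way for whoever else reaches it.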

We don't know how, despite several decades of research into this area, to build a backdoor to encrypted data that is secure and that can only be utilized by the good guys who are intended to be able to make use of it.

AP: So how does that change, or does it at all change, the role of private companies?

RP: I think that anybody who collects data from their users should be, as kind of a moral obligation, examining now: where do they collect information, how do they store it, how long do they store it for, and how easy would it be to disclose? Because, like in the movie “Field of Dreams” — if you build it, they will come. If you start collecting and storing data about people, then eventually law enforcement will come knocking on your door and asking to get access to it. 

However, as the Wyden letter points out, there's no law on the books that obligates companies like Google or Apple or anybody else to collect all of these vast streams of data about their users, whether that's location data or anything else. They do it voluntarily for their own business purposes. 

But what they collect for their own purposes typically is available with the right legal process — and so companies need to be thinking about doing data minimization on the front end. Why are they collecting all the data they're collecting? Do they need to be doing so in the first place? Are there ways that they can reduce that data collection, purge what they've got? And are there other ways to store it or to transmit it? 

For example, can they deploy end-to-end encryption in additional places that they haven't already? Are they willing to stand up and push back against compelled disclosure when — not if, but when — they inevitably start receiving requests for data about particular users pursuant to investigations into reproductive healthcare crimes?


(Image: Parsoa Khorsand via Unsplash)

AP: And how have we seen companies so far react to this change in dynamic?

RP: I don't think we've seen a lot of open acknowledgement or response from at least even the major players, much less smaller services out there that collect user data. A lot of them, I'm sure, are internally reckoning with what they could be doing or should be doing now to reduce their utility as essentially an attack surface for endangering people who have sought or obtained an abortion or who have helped or performed an abortion. 

But I haven't seen any public statements. We've seen companies talking about how they'll provide travel support for employees if they or their families need to travel outside of abortion ban states in order to obtain abortion care. But we haven't necessarily seen any commitments from big companies like Google or Meta or Apple or Amazon or whoever about what they're going to do with regards to these treasure troves of data that they hold. 

And those are just the most obvious companies that come to mind. We're not even talking necessarily yet about all of the ad-tech companies and other lower-level parts of the overall surveillance capitalism sphere. This is something that you get a little bit of in Senator Wyden's letter because he is the sponsor of a bill called the Fourth Amendment Is Not For Sale Act, which is an attempt to curtail the sale of data that is held by data brokers to law enforcement who otherwise would need to go and obtain that information with a warrant. 

AP: What else did you take away from the Wyden letter? 

RP: I think one of the points of the Wyden letter is to say just because a warrant can get issued for data, that doesn't necessarily make handing that data over okay, right? 

The first level is to say we need to make sure that law enforcement is dotting the i's and crossing the t's. For information that is extremely revealing about people — whether that is a bunch of information about their movements, as in location data, or the contents of their emails — they ought to have to go get a warrant rather than being able to get it with lesser legal process or purchase it from a data broker like any other customer, without any legal process at all. 

But the next conversation to be having — and this is something that the ACLU has talked about in some of their responses to this intersection between digital surveillance and the end of Roe — is that there are some types of information that even if you do have a warrant, the results of that information getting into government hands may be devastating and maybe should be off the table completely.

Traditionally, when we've talked about those, the biggest piece of information or the biggest type of surveillance that we have talked about with regards to saying this should just be totally off limits, warrant or no warrant, has been facial recognition. 

But now, as I said, abortion is going from a privacy right to a crime. I think it's also time to step back and say, “Oh my god, isn't this the kind of data where, warrant or no warrant, somebody's still going to potentially go to jail over their bodily autonomy?”


AP: What do you think is next for U.S. policy debates over encryption?

RP: The silver lining that I hope to get out of losing my f–cking constitutional rights would be if this can help to show Democratic lawmakers why they cannot talk out of both sides of their mouth anymore when it comes to encryption. 

Encryption protects privacy, it protects security. 

It enables human dignity and that is the same thing that reproductive healthcare does. We have just as much of a right to both of those things freely available without encumbrance. They should not be controversial in either case. 

So my hope is that as the devastating impact of the decision overturning Roe v. Wade starts to be fully felt — not just within the states that will immediately criminalize abortion, but within all the other states that will have to handle the overflow, and all the companies that are international in scope and maybe headquartered in places like California or Washington that have strong reproductive care protections — that it will be time to sort of rethink…

We can now see criminal law enforcement for what it is, which is that we're not automatically the good guys. American laws are not necessarily, de facto, any better than the laws on the books in other, more repressive countries that don't have the rule of law. 

So I don't know what will necessarily happen going forward.

I think we're very soon going to start seeing the ways that digital evidence will get used in prosecutions for people who seek or provide reproductive care. We've already seen that happen in some cases that have been brought where people were prosecuted after presenting at emergency rooms while losing their pregnancies, and we're going to see that happen more. 

Hopefully, that will show people who think it is an abomination to threaten to arrest and imprison people over their bodily autonomy that encryption is going to be indispensable to helping defend bodily autonomy.


Andrea Peterson

(they/them) is a longtime cybersecurity journalist who cut their teeth covering technology policy at ThinkProgress (RIP) and The Washington Post before doing deep-dive public records investigations at the Project on Government Oversight and American Oversight.