Deep Dive Episode 234 – Dobbs and the Potential Implications for Data Privacy

The Supreme Court’s recent abortion decision in Dobbs v. Jackson Women’s Health Organization will no doubt have many ramifications. One of the more unusual questions is the impact that Dobbs might have on data privacy. It has long been the case, for example, that cell phone location data can be used to identify certain personal behavior patterns, such as routine attendance at church. Some are now concerned that location data may be used to identify pregnant women by the locations they visit – potentially exposing them to civil or criminal charges as the underlying substantive abortion law changes. Other women are deleting period tracking apps from their phones for much the same reason. In this podcast, experts explore and debate these issues.

Transcript

Although this transcript is largely accurate, in some cases it could be incomplete or inaccurate due to inaudible passages or transcription errors.

[Music and Narration]

 

Introduction:  Welcome to the Regulatory Transparency Project’s Fourth Branch podcast series. All expressions of opinion are those of the speaker.

 

On September 1st, 2022, The Federalist Society’s Regulatory Transparency Project hosted a virtual event titled “Dobbs and the Potential Implications for Data Privacy.” The following is the audio from that event.

 

Steven Schaefer:  My name is Steven D. Schaefer, and I am the director of the Regulatory Transparency Project. Welcome to today's Regulatory Transparency Project virtual event. We are excited to host a discussion entitled "Dobbs and the Potential Implications for Data Privacy." On June 24th, 2022, the US Supreme Court issued its decision in Dobbs v. Jackson Women's Health Organization, implicating abortion rights. One of the questions is the impact that Dobbs might have on data privacy. We are enthusiastic to have with us a stellar panel of experts with diverse points of view on this topic. Thank you to our panelists for being with us today. In the interest of time, I will keep our guests' bios brief, but please find out more about them at regproject.org. That is regproject.org.

 

Stewart A. Baker is a partner in the law firm of Steptoe & Johnson LLP in Washington, D.C. His law practice covers cybersecurity, data protection, homeland security, travel, and foreign investment regulation. He has been awarded one patent. From 2005 to 2009, he was the first Assistant Secretary for Policy at the Department of Homeland Security. Mr. Baker has been General Counsel of the National Security Agency and General Counsel of the commission that investigated weapons of mass destruction intelligence failures prior to the Iraq war. He is the author of Skating on Stilts, a book on terrorism, cybersecurity, and other technology issues. He also hosts a weekly podcast, the Cyberlaw Podcast.

 

Jane R. Bambauer is the Dorothy H. and Lewis Rosenstiel Distinguished Professor of Law at the University of Arizona James E. Rogers College of Law. Professor Bambauer teaches and studies the fundamental problems of well-intended technology policies. Her research assesses the social costs and benefits of big data and how new information technologies affect free speech, privacy, and competitive markets. She also serves as the Co-Deputy Director of the Center for Quantum Networks, a multi-institutional engineering research center where she facilitates research on economic and regulatory policy for emerging markets and quantum technology. Professor Bambauer's work has been featured in over 20 scholarly publications, including the Stanford Law Review, the Michigan Law Review, and many others. She holds a BS in mathematics from Yale College and a JD from Yale Law School.

 

Danielle K. Citron is the Jefferson Scholars Foundation Schenck Distinguished Professor in Law and the Caddell and Chapman Professor of Law at the University of Virginia School of Law. Professor Citron writes and teaches about privacy, free expression, and civil rights. She serves as the Inaugural Director of the school's LawTech Center, which focuses on pressing questions in law and technology. She is the Vice President of the Cyber Civil Rights Initiative. Her latest book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, will be out in October 2022 from Norton. Her first book, Hate Crimes in Cyberspace, from Harvard University Press, was widely praised in published reviews. She has published more than 50 articles and essays, including in the Yale Law Journal, the Michigan Law Review, and many others. For the past decade, Professor Citron has worked with lawmakers, law enforcement, and tech companies to combat online abuse and protect intimate privacy.

 

Our moderator today will be Paul Rosenzweig. He is an accomplished writer and speaker with a national reputation in cybersecurity and homeland security. He is the founder of Red Branch Consulting PLLC, a homeland security consulting company. He is also a senior advisor to The Chertoff Group. Mr. Rosenzweig formerly served as Deputy Assistant Secretary for Policy in the Department of Homeland Security. He is a Professorial Lecturer in Law at George Washington University and a Senior Fellow in the Tech, Law, and Security Program at the American University Washington College of Law. He is a contributor at Lawfareblog.com. He is a member of the ABA Cybersecurity Legal Task Force and the United States Court of Appeals for the District of Columbia Circuit Advisory Committee on Admissions and Grievances. He is a graduate cum laude of the University of Chicago Law School and the author of Cyber Warfare: How Conflicts in Cyberspace are Challenging America and Changing the World.

 

After the discussion between our panel of experts, if time allows, we will go to audience Q&A. Please enter any questions you have into the Q&A function at the bottom of your Zoom window. Note, as always, The Federalist Society and Regulatory Transparency Project take no position on particular legal or public policy matters. All expressions of opinion are those of the speakers joining us. Paul, over to you.

 

Paul Rosenzweig:  Thanks Steven for that introduction of our topic and our guests. I’m delighted to welcome everybody to this webinar discussion. The Dobbs decision in June was an earthquake in the legal realm and it will have ramifications across everything from politics to morality to legal doctrinal analysis to the substantive law of abortion. One of the most surprising, at least to me, aspects of that decision that, frankly, I would not have anticipated prior to its coming to the fore, is how Dobbs is — or may affect online privacy issues.

 

To cite just one example, it was recently reported that, in a case out of Nebraska, the police are investigating a woman for allegedly assisting in an illegal abortion. They have sought from Facebook the private messages of that individual, implicating both, obviously, the abortion issue itself directly but also, for our purposes today, the question of when and how data privacy issues will intersect with the abortion right.

 

I am sure that everybody on the call and everybody listening in understands that data privacy is a broad and floppy subject, ranging from messaging to geolocation to other aspects of our digital selves, and I hope that in this coming 45 minutes to an hour we're going to be able to explore how all of that bit of modern reality intersects with this new understanding of personal privacy that arises in the post-Dobbs world.

 

So to begin, let me start by asking Jane to kind of introduce us to the topic, examine and expound upon the scope of the problem, how it connects to Dobbs. Maybe use the Nebraska case I mentioned briefly as a jumping off point and tell us a little bit more about it. And in the end give us all, both on the panel and in the listening audience, a kind of baseline of why it is we’re here and why this surprising turn of events has taken at least the data privacy world a bit by storm. Jane?

 

Jane Bambauer:  Yes. You’re right. There are opinion pieces and news items too that have been quite persistent since the original Dobbs opinion was leaked. So there’s a lot of focus on whether data that was collected for some other purpose might be repurposed in order to enforce the new abortion laws and especially criminal laws that may be coming down the pipes.

 

So I’m going to talk about the Nebraska case in a little bit but let me start with I think some data sources that are more likely to be used or at least are more likely animating the anxiety here. So this would include location data that everyone’s cell phone provider collects and also that many of your apps collect certainly your GPS app and whatnot. And then also things like Google search data that wind up having search terms that are going to be — that are going to indicate what a person was thinking about or researching.

 

And then there was also a lot of reporting on period apps. So these are specific apps that collect information about a woman's menstrual cycle, and to the extent that those apps wind up sending the data off the device into the cloud, then there are concerns — both just general data security concerns about that, but also concerns about how that data might be used, maybe in combination with other types of data, in order to determine that a woman in a particular state where abortion is illegal missed a period or may have become pregnant and then later had a pattern of location data that suggests she sought an abortion.

 

So all of these types of data sources are useful for many types of criminal detection. And in many ways this debate is just an intensification of one that's been happening for well over a decade now. But I'd say it's also true that some crimes are more amenable to being detected and tracked in this sort of highly technological, data-driven way than others. And so maybe seeking an abortion — given the inevitable kind of location-changing nature of that service, where a woman has to go someplace that might eventually be identified as a place where illegal abortions are provided and stay there for a while — might have a particular kind of data fingerprint that other types of crimes just don't wind up having.

 

And so it may be more likely that these types of crimes, if they become crimes, make use of this data for prosecution. And I should say that most of this data — all non-content data, including location data at least — might be available to law enforcement using a simple subpoena, so without a warrant. Of course, the Carpenter decision recently announced by the US Supreme Court changes that somewhat and puts some of that in question, which we can talk about later.

 

And in response — so just a little more table setting — in response, some companies including Google have taken action. So Google has promised that it's going to delete data — automatically delete data that it collects related to location tracking when a person visits a medical clinic. I'm actually not sure that does a whole lot in the states that are looking within state for illegal abortions, if those services are not taking place at a clinic — but in any case, I think it's at least a gesture, and maybe more than a gesture, to try to remove the data even from sort of a temptation of law enforcement access.

 

I think another big question though that the data privacy debate kind of elides, in my opinion a little too much, is whether there's really an appetite yet for criminalizing and prosecuting women in particular for seeking abortion. I would have said — maybe four years ago I would have said, "Even if Roe is overturned, there's just simply not a viable sort of political will to go after and prosecute women as opposed to abortion providers." But I think that's changed, and the Nebraska case suggests that there has been a shift. In fact, on the campaign trail when Donald Trump was running for president, he made, at the time, what seemed like a gaffe, where he asked, "Well, why wouldn't we prosecute women?" And everyone sort of understood that even the most pro-life and politically active supporters of abortion bans didn't want to place criminal attention on the women themselves, but I think that has shifted.

 

So then another issue, though, is whether the criminal laws that either have emerged or will emerge in states are going to criminalize only abortions within their own state, which would lead to one set of questions about what types of data are useful. Or whether, instead, at least one state is considering criminalizing crossing state borders in order to seek an abortion where it is legal in that jurisdiction. And so that raises not only data privacy questions but also, I think, interesting substantive criminal law questions: whether that's the type of behavior that should even be within the purview of criminal sanction.

 

Okay. And so then finally, I am not sure though that Nebraska answers these questions that I think are quite important and maybe prerequisites for understanding the risk of data privacy violations. And that’s whether states are likely to both enact laws and enforce laws that most of us would find kind of repugnant and contrary to what the criminal system should be used for. So the Nebraska case I don’t think is actually a great case to hold up for fear mongering because it involves a — well, it involves a mother guiding a 17-year-old to abort her 23-week-old fetus.

 

And so that’s — I think reasonable minds, of course, run the full gamut in terms of the morality of abortion and when the interests of potential life start to equal or even surpass interests in the bodily and economic freedom of the mother. But I think we understand that many people think at some point that happens and 23 weeks is a pretty long time. And so even in Europe this would be the type of abortion in most countries in Europe that would be illegal. And so I think I’m interested in whether we’re likely to see prosecutions of the sort that would make use of this data, and then, if so, how carefully to refine any data privacy precautions we might want to take.

 

Paul Rosenzweig:  Well, that’s a great introduction. Before we go to the substance. Just a quick check with either Stewart or Danielle. Do either of you have any other broadly speaking use case scenarios that you think ought to be in the pot for consideration as potential avenues for data usage in criminal prosecutions relating to abortion? Stewart?

 

Stewart Baker:  I think that pot is way overfull with dumb use cases already. So I have no more to add.

 

Paul Rosenzweig:  Danielle?

 

Danielle Citron:  You mean so — when I think of the —

 

Paul Rosenzweig:  I’ll give you an example.

 

Danielle Citron:  Yeah, because there are use cases I could make up —

 

Paul Rosenzweig:  Tower dumps near abortion clinics as a potential way that — tower dumps, for those who don't know, are basically pulling location data from a cell phone tower near the location of a crime. It would be used, for example, in a bank robbery, possibly. And one could imagine the potentiality. Any others — I'm just trying to set the table before we all go talk about what we think about this.

 

Danielle Citron:  Right, because I think it's important to recognize that law enforcement — state, federal, local — have contracts with data brokers, location data brokers in particular, but also data brokers that have 3,000 data points on each and every one of us, including whether we terminated pregnancies — like the microscopic level of detail — and it requires no subpoena or warrant. They have contracts ongoing right now. So I think the idea that we're going to in all cases have interesting Fourth Amendment questions — unfortunately, due to the third-party doctrine, and I hope courts come back and we start rethinking, especially as to geolocation, the interplay, as Jane was saying, with Carpenter — it's just, the reservoirs of intimate information are overflowing and easily accessible to law enforcement for —

 

Paul Rosenzweig:  Did we just lose Danielle?

 

Jane Bambauer:  I believe so. I can’t hear her and she’s frozen too.

 

Paul Rosenzweig:  Yeah. Okay. Well, those are the vagaries of Zoom discussions. So hopefully, we’ll get her back. Perhaps one of the people like Graub or Steven can try and reach out to her on that regard. So let’s continue the discussion with those who are present. Danielle.

 

Danielle Citron:  Hi. Sorry about that. I don’t know. I’m at the law school too. I thought this would be the best place for it to be stable.

 

Paul Rosenzweig:  Okay. Well, let’s move on —

 

Danielle Citron:  Okay.

 

Paul Rosenzweig:  — to the substance and let me come back to you Danielle while I have you on. If I were to kind of broadly characterize levels of concern, my pre-panel understanding is that your levels of concern are significantly higher than Jane’s and certainly much higher than Stewart’s. So we’ll come to you next, Stewart. So put a little salt on the tail. Tell us why you’re concerned. Riff a bit on the issues that are raised in your book, which I hope to get a copy of when it comes out. And tell us why you think that this intersection is particularly troubling.

 

Danielle Citron:  Yeah. So my work focuses on intimate privacy. That's the privacy around, and the access to, and information about our bodies, our health, our closest relationships, our sexuality, our sexual orientation, our sexual activities, and our innermost thoughts, which we document of course with our Google searches and all of our communications. And by my lights, intimate privacy is a foundational privacy value that deserves special protection because intimate privacy is the precondition for human flourishing. It enables us to kind of figure out who we are, to develop our authentic selves, and that's with people. That's not to say we're sort of experimenting alone; it's with trusted others. It enables us to enjoy self-respect and social respect so that we're seen as full people rather than just our vaginas or the fact that we have a high likelihood of developing type 2 diabetes. We can be seen as whole and fully integrated people.

 

And Charles Fried said it best: "Privacy is the precondition. It's the oxygen for love." That's how we get to know each other: we reveal things to one another we wouldn't reveal to other people. We forge — it's that reciprocal vulnerability that allows love to happen. And so because in so many ways individuals, governments, and the corporate sector are collecting hand over fist information about our most intimate lives, we are woefully underprotecting it in ways that I think have long-term downstream effects that we just can't wrap our heads around, effects that have to do with discrimination, scoring, ranking, rating people, jobs you don't get, insurance premiums that go up that you don't realize. And this is especially true for women and minorities.

 

And so just — so I'm just going to go back to Jane's discussion, about your skepticism, Jane, about the political will. So I'm just going to draw from Michele Goodwin's book on policing the black body: over the last nine years, we have seen over 400 prosecutions of black women and girls during pregnancies for having, for example, taken drugs during pregnancy — and so using fetal homicide laws that were really not meant for that. They're meant for when someone kills a mother and the baby. So I do think that we're going to see, for the most vulnerable, prosecutions that may not affect the more privileged folks, but I think we are going to see them unfortunately — I guess I worry, Jane, about the political will even in my own state at the moment — about the kinds of prosecutions we might see depending on the rollout of these state criminalization laws.

 

And so we see companies as the data handmaidens of government, and individuals too. So individuals — you've got an ex who's angry at a lover, tells law enforcement, "I think my ex had an abortion. Here are the texts she sent me. Here's — I have spyware on her phone that says exactly what she's doing." And that has opened a door to the surveillance of intimate life that — we can close it. So I've been working with folks on the American Data Privacy and Protection Act and My Body, My Data. There are various innovations happening on the Hill. We'll see what happens. But we need special protection for intimate information. And so it's so essential to figuring out who we are that I hope we see developments on the Hill come to something. I hope that was helpful, Paul.

 

Paul Rosenzweig:  Okay. Well, it’s helpful. Give me a little more specifics. What are the — you’re obviously concerned about the growth of a prosecution potentiality. That’s kind of — I mean, that’s a political question that’s not really tied directly to the data issue. Let me ask it this way. Are your data concerns about abortion unique to abortion or do you see the same concerns about the whole panoply of intimacy privacy that you’ve defined, whether it’s transgenderism, marriage, sexual orientation, whatever?

 

Danielle Citron:  Absolutely. Right. Because I wrote my book —

 

Paul Rosenzweig:  Is there something unique about abortions, I guess?

 

Danielle Citron:  Totally. And, you know, it's interesting, and you're so right. That is, I can't dictate what state legislators are going to do, and it's in their judgment and assessment whether it's going to be criminal law. With a warrant, you would think this is perfectly appropriate. And so — and the Supreme Court said, "Look, there's no constitutional right — at least in the United States Constitution — there's no right to privacy or right to bodily autonomy recognized in the Fourteenth Amendment's due process clause." So — but my — so I wrote my book and have been working on intimate —

 

Jane Bambauer:  Oh bummer.

 

Paul Rosenzweig:  It was going to be an interesting point. I’m sure.

 

Danielle Citron:  My concerns remain —

 

Paul Rosenzweig:  Here we go.

 

Danielle Citron:  Are we back?

 

Paul Rosenzweig:  Yeah. Repeat the last two sentences.

 

Danielle Citron:  Of course. I’m sorry. So I wrote my book before Dobbs came down and before the leak of the draft. And so my concerns — let me just give you a few examples because I think they always help us kind of wrap our heads around why I care about intimate privacy. So individuals will deprive each other of intimate privacy, and it’s often men depriving women and girls and children of their intimate privacy. And what I mean by that is there are 9,500 sites right now, many of them hosted in the United States, that hawk intimate images of people without their consent. So up-skirt photos, down-blouse photos, intimate images, so nude or sexually explicit photos. And it’s not always an ex. It could be a deep fake sex video when a woman’s face is morphed into porn.

 

And when that information is accompanied by your full name and your home address, as it often is, it's not only devastating to you personally because you have such incredible fear. Who's going to come approach you? It changes your sense of safety. You're constantly on guard. But it changes your economic opportunities. Teachers have lost their jobs because Google searches of their names include non-consensual pornography. It's true of parochial school students as well as high school principals, and of nurses. Every type and stripe of person, unfortunately, can be tormented. And we just don't have a comprehensive approach to intimate privacy, and we don't have enough pro bono, low-cost counsel. There are laws on the books, and they, unfortunately, stay on the books and not in practical reality.

 

So we have some challenges. And for intimate privacy, when it comes to companies, let's just take Grindr, for example. Grindr is a dating app, a hookup app for gay men and bi and trans men. And the app was encouraging people — and doctors were thrilled about it, public health officials were thrilled — to share their HIV status, because then you could have open and clear conversations about protection. It's like safer sex this way. The Wall Street Journal broke a story that Grindr was, of course, selling, sharing that information, including HIV status, with third parties.

 

And the fallout, from a lot of people, was, "I'm not sharing this information anymore. First of all, I know I'm open to discrimination. I worry." And so the sharing of intimate information, which could be so pro-social and helpful, in that it could enable safer sex, is something people are just going to be chilled about, and they were chilled in response. So I wanted to give you some examples of how intimate privacy is crucial to our self-development, and love, and sex — not just love, but all the ways in which we connect with one another. So we woefully underprotect that information because it's like a notice-and-choice regime of "We told you so" in our privacy policy, and we need to do far better.

 

Paul Rosenzweig:  So Stewart, if I could characterize, I hope fairly, Danielle's point, it's that there is no protection for intimate privacy in other areas of — no protection for privacy about intimate issues in other areas of the law. And though we haven't seen a lot of it repurposed in the abortion area, there's no reason to think that it won't be if a state criminalizes it. And that is the ground for her concern. I know, I'm sure, that you are skeptical of the need for that concern, that you think that this is — I won't say a tempest in a teapot — but of less significance than Danielle or perhaps even Jane have posited it to be. Tell us why.

 

Stewart Baker:  Sure. So let me start with what I thought we were going to be talking about and then move to this other issue. And I will say you accurately have summarized my view. I once said, when talking about going back into government, that the only job I would take is the chief privacy skeptic job. And since there is no such job, I'll have to do it from where I sit. The thing I was struck by was the rush to turn Dobbs into a privacy issue and the complete lack of any basis for doing that. It's bizarre. People said, "Oh well, they will track you to an abortion clinic that's providing illegal abortions, and then you'll be in trouble." But, of course, as we've said already, a lot of the focus — most of the focus of these regulatory laws is on the clinics. If it's illegal to provide an abortion in your state, there isn't going to be a clinic. There isn't going to be a location that you can be tracked to. And so the whole notion that you're going to be tracked to a place where it's clear you're committing a crime is, I think, completely without basis.

 

So then people say, “Well, what if you go to an out of state clinic? You could be tracked to the out of state clinic.” And I think the problem with that is it assumes all kinds of things. It assumes that state legislatures are going to be so enthusiastic about expanding abortion prohibitions that they will try to expand them to behavior of their citizens in other states. There isn’t any sign that that’s happening. And so the idea that you’re going to be tracked to another state’s clinic and criminalized for that is also without foundation. And there were five votes as I counted them in the Dobbs decision itself to say, “Oh states can’t do that. That would be a violation of the right to travel or what have you.”

 

So one, we don’t see the laws. Two, we don’t see the likelihood that those laws would be upheld. And third, the appetite for criminalizing the behavior of women who are seeking abortions is really, really limited. There have been some prosecutions of women who got illegal abortions prior to Dobbs either because they abused their fetuses with their drug habits or in the Nebraska case — and there are probably three of these a year — because they got a very late abortion, and then did, as the woman in the Nebraska case did, buried the results, put it into a plastic bag so you’re not sure the fetus might not have died in the bag, dug it up again, burned the body. There are a lot of violations of law there that have nothing to do with abortion that are typically used to prosecute those kind of things, but location data is not really going to be relevant in any or, as far as I can see, any of those prosecutions.

 

So we’re at a point where there is no point in talking about the way in which your location data is going to make you susceptible to prosecution after Dobbs. And what we’ve started to see is this pivot to “Yeah but privacy.” Privacy is really important, and it can be intruded upon by ordinary criminal evidence gathering techniques that use your electronics against you. And since we’re talking about privacy and Dobbs, let’s talk about why we need a big privacy bill. And that takes us to this intimate privacy question. And I haven’t thought deeply about intimate privacy, but it does seem to me that we have real definitional questions there.

 

I want to reveal my HIV status to the people I’m trying to have sex with, so I’m volunteering that information. Do I want that information used to tell me about a new HIV treatment that is now commercially available? Well, probably. Do I want it sold to somebody who’s going to use it to out me in my capacity as a teacher? No. But how do — and the idea of defining that by just saying, “Well, it’s intimate privacy. So therefore, we solved the problem,” strikes me as only opening Pandora’s box, not actually selecting things that we don’t like that we’re trying to address.

 

So yes, you can certainly imagine privacy abuses that we ought to address, but I suspect that we’re better off asking, “What are the harms we’re trying to stop? And what are the good things we will prevent by stopping those harms in a very broad-based way?” And all of those are the same questions we’ve been wrestling with for 20 years on the question of, how should we regulate privacy? And it’s true, the EU said, “We’ll just make it a human right, and we’ll worry about defining it later.” I think we’re getting close to that with the label “Intimate Privacy.” Let’s make it a human right. And then we’ll worry about the details later. I think we’re better off asking, what are we trying to do? And how would we regulate it in a way that will achieve our results without doing damage to values that we care about?

 

Paul Rosenzweig: So Stewart, let me follow up. It strikes me from this first set of rounds here that I had maybe expected this discussion to address the privacy implications of Dobbs kind of abstracted from other intimate privacy problems. But it seems to me that the consensus in this group is that there really is no difference, in a legal framework type format, between intimate privacy relating to Grindr and disclosure of HIV status and intimate privacy relating to abortion status. So assuming that's a fair characterization — maybe you'll say no — but assuming that's a fair characterization of where everybody is, is your lack of concern then based on your predictive judgment that the criminalization of abortion — criminalization of women for having an abortion — is just radically unlikely to become commonplace?

 

Stewart Baker:  Yeah. And I think if it does become commonplace, we can have this panel again in six months or a year, but it seems like a very weird thing for everybody to be talking about Dobbs and not talking about what limits are appropriate on abortion and instead talking about what we ought to do about tech data. And it’s a very, in my view, very unfortunate focus, this moral panic about the location data because it has produced — Google has said “We’re not going to collect information on visits to abortion clinics.” So if somebody goes to an abortion clinic, as has happened in the past, and assassinates somebody who’s providing abortions, we’re not going to have the usual information we would have about who did it because Google will have destroyed the data about the visits to the clinic. It’s not serving any purpose to protect women, but it has all kinds of unfortunate secondary consequences.

 

Paul Rosenzweig:  So let me circle around and come to you Jane and ask you, first, do you share Stewart’s predictive judgment about the future? Second, is that relevant to the data privacy issue we’re talking about at all, or should we treat it as an abstract legal issue or not abstract but a real legal issue?

 

Jane Bambauer:  Yeah, I think I'm going to try to square the circle because I think — so first of all, I don't totally share Stewart's prediction. Like him, I think it's unlikely that at least a lot of states are going to not only put a criminal statute of this sort on the books but also vigorously enforce it. However, I'm softening on that prediction. There are powerful political minorities that are really running the show in at least the Republican Party that might want that sort of law and might want to see active enforcement.

 

And so I think the data privacy question is related to the substantive criminal law question but not in the way that I’ve heard expressed so far. So let me try to do it. I think the growing anxiety about the amount and detailed nature of the data that is currently available with some level — maybe low level of process from law enforcement — really forces us to focus on a potential world where we can actually detect all these crazy crimes that we have on the books. I mean, some of the crimes obviously aren’t crazy.

 

And so when I hear — when I read pieces like Danielle's in Slate, which was excellent, but when I read it as using the abortion prosecution as a reason to worry about data privacy, I just automatically think of all the examples that Stewart — some of which Stewart raised — like, well, the same data, the exact same data, is useful for child abduction, finding child sexual abuse, all of these are — intimacy is a source of great social welfare, but also great threat and pain. And crime in general is a real disruption of not only intimacy but autonomy and health, and we should care a lot about the crimes that are bad.

 

And so it raises — for me, it raises the question. I think data privacy should force us to focus not on the collection phase of data. There are just too many valuable uses for this data, and that is not going to stop. It's only going to intensify, in my opinion. We also probably shouldn't focus unduly on data transfers per se. But we need to start thinking about the implications of uses and whether that implies that something else in our criminal system is actually malfunctioning, whether we're actually over punishing too many things, whether, actually, we need to think about constitutional rights not only related to the Fourth Amendment but even maybe related to due process limits on what types of criminal laws can be brought out in the first place.

 

I mean, otherwise these are the same sources of data that I think would be used. If Danielle gets some of the enforcement that she wants for, like, criminal violations of harassing women by creating porn with their faces, the same data I think is going to track, well, who did it? Who was online at this time and posted that picture? So I think the data privacy problem, rather the policy solutions, are going to be more tricky than, as Stewart said, stating an interest in defining intimate privacy.

 

Paul Rosenzweig:  So let me finish this round here — and then we’ll change to another topic — by going to you, Danielle, and say Jane has essentially said that it’s not the collection or the transfer, it’s the use or what I sometimes have written about as privacy as consequence and that the same data that you’re worried about in revealing intimacy might also reveal invasions, inappropriate, illegal, criminal invasions of intimacy. Do you accept that framing? And so are we just talking about implications of use? Or do you think that privacy is a stronger read — read is the wrong word — stronger thing that needs to be independently kind of considered?

 

Danielle Citron:  So I guess, unsurprisingly, I'm going to disagree with my beloved friend, truly who I do love, Jane. There is information that our tools and services simply do not need to collect, keep, process, and store. So if you're using a tool or service and they don't need to know and to store details about your sex life, details about your preferences for certain individuals, your health information — if they don't need to store that information for providing you that service, they shouldn't collect it and they shouldn't store it. The risks are — it's true they're long and downstream. They often will impact women and minorities, sexual and gender minorities as well as racial minorities. The information that is ultimately then sold, shared, reshared, repurposed, and included in digital dossiers has a huge impact on people's life opportunities, including their ability to get a job, to get an interview, their life insurance premiums. That is, there are real consequences in the here and now. And so I don't think we should collect it, and it makes the security issues less complicated. If we don't have it, we don't have to worry about it leaking and ending up in the hands of hackers.

 

So I do think that collection should be on the table as part of the conversation, though, of course, I might just disagree a bit about the kind of information that you need to find a harasser, stalker, intimate privacy invader, including IP addresses. Is that intimate information? Is that information about your health, your body, your sexual activities, your reading habits? Not precisely. It wouldn't be intimate information. So it's not that it's easy, but I'm going to disagree with Stewart that you can't define it. I try to define it as clearly and precisely as I can in my book and in my work. We can define intimate privacy and what we mean by it.

 

And I don’t think that — I think the idea that — the moral panic idea that Stewart raised — just look to pre-1973. Women were self-aborting. They were engaging in practices where people were providing help and getting abortions. So you said, “This will never happen. There will be never in a state in which it’s illegal after a certain amount of time.” I think that’s absurd. We have a huge amount of history that shows us that we — that women and girls will seek out — pregnant people will seek out the possibility. And there are people who want to provide. So it seems to me that there’s a whole lot of harm — that the Supreme Court totally ignored women’s living reality and the harms and there are harms that — I think it’s not a moral panic that we ought to wrestle with. And it’s not that Dobbs makes me think that intimate privacy matters. It’s not why, but it’s a part of it now.

 

Paul Rosenzweig:  Okay. So let me transition to this. And before I do the transition, let me say to all those who are participating online that we have been at this for 45 minutes. So probably in a few minutes, I’m going to turn to your questions. Drop them in the Q&A on Zoom if you will. And I don’t know how the people on C-SPAN can ask questions. I apologize for that. Maybe there’s a methodology, but nobody told me that.

 

So Stewart, let me turn to you and ask you this question. And I want to just kind of dig down a little deeper. We’ve been talking about data privacy as kind of an undifferentiated set of technologies of everything from private messages on Facebook and period apps to location data and, in my hypothetical, geofenced tower dumps. Do you think that we can or should be able to differentiate between those? Would you, for example, agree that perhaps private messages on Facebook are different than location data? Or do you see it as all an undifferentiated mass?

 

Stewart Baker:  No. Of course it's different. Although, weirdly, if I understand Danielle and some of the others who are talking about intimate privacy, they're treating location data as intimate data when, in fact, it's rarely that — sometimes, of course, it's going to be very intimate. And the very same data that I care not a bit about Google having for maps, I would care a lot about if somebody were encouraging a physical attack on me and publishing my home address. You can't say the data is intimate. It's the use that, as you said, it's put to that raises questions about whether it should be regulated.

 

So I do think the more we look at particular cases, the more we're going to say, "Well, we really can't just say the data needs to be characterized." I remember it used to be — people used to say, "Well, the obvious one is health data. What could be more intimate than that? We should treat that as sensitive data." And then I quite vividly remember when the Swedish Data Protection Authority imposed a fine on a church newsletter for wishing that a guy who had broken his leg would "get well soon," because it was disclosing intimate health data about this person without his permission. So there's just almost no way to separate these things from context and treat them as special, to my mind.

 

And then comes the question, do you need this data? Well, surely if there's anything where we could say you need the data, where you have services that need the data, it's location. Everything from maps to finding nearby businesses, etc., etc. We're going to give that up with enthusiasm. And if we say, "You can't use it except to provide the service," you're essentially saying using data to serve ads is out of bounds. And yet most of the things that we value on the internet are intimately, if I can say it, tied to the use of some of the data to serve ads. And in many cases, we are quite enthusiastic about that. We want to know about products that we've been looking for or talking about or using when we get online. So unless we're going to totally transform the internet economy and downgrade large numbers of apps and services, we can't just casually say, "If you don't need it to provide that service, you can't have it." And so again it seems to me we are courting massive consequences on the basis of anecdotes.

 

Paul Rosenzweig:  So you sort of ran back to location services though. And so I was trying to get you to answer at least for me whether or not you think private messages on Facebook would maybe be in a different category or — which is the Nebraska case — or —

 

Stewart Baker:  Yeah. Absolutely. I mean, I always thought it was creepy that Google was reading my mail. And I think they still are, but at least they're not serving me ads based on what's in my email. So sure, the communications — look, in US law, they've been separated out and given special treatment, even as to government searches, since at least the 1980s and probably since the '60s. We all recognize that the things we communicate are much more deserving of protection than facts about us.

 

Paul Rosenzweig:  Okay. Well, we’re starting to get some really good questions in here. Let me ask anybody on the panel a simple factual question. One of the members asks “Are there states that are actually attempting to pass legislation that would allow for the prosecution of women for having an abortion? I suspect either Jane or Danielle knows an answer to that. Either of you?

 

Jane Bambauer:  Well, I thought the answer was "Yes," but I thought it was — I don't know if Oklahoma was considering it. Does that sound right to you, Danielle? I'm so sorry. If someone even in the Q&A wants to put in a link to a news item or something, that would probably —

 

Danielle Citron:  Totally. I think the New York Times has a running — some media outlet has a running link of the proposed laws and the laws that have been triggered. But I think that’s right. There are laws that would apply to women and not just providers.

 

Paul Rosenzweig:  Okay.

 

Danielle Citron:  Certainly the civil penalties bounty law. Anyone who helps a woman, or someone who's pregnant, in the effort to end a pregnancy — there's the civil bounty of $10,000 under SB8. So the implications are sort of broader than just the criminal law.

 

Paul Rosenzweig:  Okay. So, and another kind of easy question before I get to a more deep one. Somebody asked, and I assume the answer is probably, “Yes,” that everybody on the panel would think that the same sets of concerns might apply to people who want to go to crisis pregnancy centers, which for those who don’t know, are actually centers intended to encourage women to keep their children and not abort them. Does anybody think that whatever it is we think about this, the answer would be different for crisis pregnancy centers than it would be for abortion clinics?

 

Jane Bambauer:  Well, I don’t see the threat of criminal prosecution there, but in terms of the data that is at least theoretically accessible. I don’t see a difference.

 

Paul Rosenzweig:  — that’s a fair —

 

Stewart Baker:  We’re going to see regulation of those — more aggressive and more punitive regulation of crisis pregnancy centers, but the political valances will be switched. It’ll be the blue states that are saying “We don’t want these people to exist, to be online, to be able to advertise. We want Google to put cautions on their ads. We think they’re misleading people. They’re really about persuading people not to get abortions and not about helping them with their pregnancy.” And so we’re going to see efforts to regulate them pretty aggressively. But I can’t believe they’re going to punish the people who go to them.

 

Jane Bambauer:  But I actually do think it's useful for a debate about data privacy, especially in the law enforcement context, to actually hold two use cases in your mind that have opposite political valences. So maybe crisis pregnancy centers isn't the one, but maybe crossing state lines to go to a gun show or something. We can imagine —

 

Paul Rosenzweig:  I like that one.

 

Danielle Citron:  I think that’s right.

 

Paul Rosenzweig:  — data about where guns are illegal.

 

Jane Bambauer:  Right. So you want — whatever theory of data privacy management you have should not be “Well, we should have privacy when I don’t like the law, and we shouldn’t have it when I do like the criminal law.”

 

Paul Rosenzweig:  Although, that is really where most of the privacy debate is. It’s driven by people —

 

Jane Bambauer:  Well, it’s kind of also where human nature is too.

 

Danielle Citron:  I’d keep us with the crisis pregnancy centers because I think it’s the — revealing about our decisions about our reproductive lives and health versus guns which, of course, is inscribed and now understood as an individual right to bear arms. But it’s not intimate privacy. But it’s such a great and important point, Jane, that you make, that we should have opposing use cases in our heads to make sure we’re shorning it of politics.

 

Paul Rosenzweig:  So let me offer one further example — marijuana use. It's legal in some states. It's a crime in others. We really have had a surprisingly low number of states trying to prosecute people from their state who go to other states and get high illegally. Even the federal government, which still makes it a felony in the states where it's legal, has not aggressively prosecuted people there. I'm just not sure, with some of these hard issues where everybody — people of good faith — have very different views, whether even the people who want it to be criminal want to take it so far as to say, "If it's legal where you do it, we still want to penalize it."

 

Danielle Citron:  I think the fetus and fetal personhood take just changes that example entirely because — smoking pot, you're doing harm to yourself. You could be doing harm to others, but the Millian principle — when it comes to pro-life folks, they think it's a person. So harm to yourself — getting yourself addicted to drugs — is not affecting another life, as the pro-life story would have it. It's just a bit different.

 

Stewart Baker:  I am very conscious of the fact that for 50 years this was a debate that was polarized. One side didn’t have to show up, and the other side could say anything it wanted. And now, everybody has to have a view. And most of the views that have been expressed in polls are really quite middle of the road. And it’d be very easy as a politician now, I’ll say as a conservative politician, to get burned by saying, “I can say anything because I’ve always said anything.” That’s just not the case anymore.

 

Paul Rosenzweig:  Well, this is great. I have many more questions here, but we're kind of coming up on the top of the hour. So let me pick one set here that is actually a really good three-parter because the questioner asked questions of all three of you. So I'm going to read all three, and then I'm going to go in the opposite order from which I read them.

 

Stewart, to the extent that there are things we communicate — that things we communicate are facts about us — don't you think that all aspects of private communications, whether health or otherwise, would be characterizable as private?

 

Danielle, can you give us an example of private data that could be at risk of being revealed post Dobbs that might be something a conservative voter would balk at and react to?

 

And then, Jane, can you frame your reasoning for why privacy matters less than, in this person’s perception, hypothetical future crimes?

 

Do those make sense to you?

 

Jane Bambauer:  I’m good.

 

Paul Rosenzweig:  Okay. Go for it, Jane. You get to go first.

 

Jane Bambauer:  I think my critique of using Dobbs to try to shape the privacy debate is that we're asking privacy to do too much. Basically, most of the concerns that we're raising, at least with respect to prosecuting women who try to seek out an abortion, stem from the fact that either we don't think that abortion should be illegal at all — I think that describes a good number, possibly a majority, of people in this country — or that, if it's illegal, it should be a regulatory issue, not rising to the level of actually putting someone in jail.

 

And given that that’s my instinct about what’s driving this debate and this anxiety, I think we should place the focus on that point, on what should the bounds of the criminal code be? How do we know what sorts of things should be merely regulated versus actually immoral and having externalities that are so bad that we want to actually put a person’s life on hold and incarcerate them. And if we did that, and we had good answers to those questions, and we were able to monitor the criminal code, at least some, not all, but some of the privacy problems would be resolved or diminished.

 

Paul Rosenzweig:  Yeah. It does strike me that, to some degree, a lot of this is about overcriminalization of things that we generally think should be — I mean, Stewart mentioned marijuana. Every one of the examples that you all have come up with pretty much sounds in the "we're not even sure they should be criminal" vein. Danielle? An example of private data that could be at risk that would freak out the conservatives who are listening.

 

Danielle Citron:  Okay, so I think we should go back to Warren and Brandeis. The example that they give — the husband writes a letter to the son explaining that he's not having lunch with mom, and what's in that letter, even if it's quite prosaic and boring, they say no one should be reading, except if you want other people to read it. And so think of your Amazon Echo in your home collecting data and storing it in the cloud. Those conversations you have, whether in the bathroom or the bedroom — I feel like people are going to be with me, even if — it's very Warren and Brandeis: "What shall be whispered in the closets shall not be shouted from the rooftops." I feel like I could get folks on my side for that. Maybe I'm wrong, but I'm hoping I could get some adherents on the conservative side.

 

Paul Rosenzweig:  Okay. So Stewart, the guy — you’re basically asked to defend the extreme end of your view.

 

Stewart Baker:  Well, I — then I’ll join Danielle. I think I said that we’ve always treated communications as more private, and we provided more legislative support or protection for communications. It’s not that every communication is a matter of intimate privacy. In fact, I’m guessing that no more than two a week of mine come close. But it’s enough that treating the categories separately is acceptable. It’s a reasonable way to address what’s a quite reasonable concern about how disastrous it would be if everything we ever said in a communication was available to everybody. So that said, we have exceptions. If you’re talking to somebody who has willingly agreed to wear a wire, you got no privacy at all. And if you’re talking about a crime and the government has probable cause to believe that your communication was about a crime, you have no privacy. So we have exceptions to a lot of those rules but starting out with communications as broadly speaking more protected than other things makes sense to me.

 

Paul Rosenzweig:  Well, I see that Steven has rejoined us live, which is the visual cue that our time is up. So I will say thank you and then he will say thank you. My thanks are to my friends Jane, Danielle, and Stewart for participating in this conversation and for a really engaging, thoughtful, and civil discussion which is so rare these days. I’m overjoyed to have been able to participate with you. Steve?

 

Steven Schaefer:  Yeah. Thank you to all of our panel of experts for sharing your insights today and thank you to the audience for joining us. For more content like this, please visit regproject.org. That is regproject.org. And thank you to everyone. Goodbye.

 

[Music]

 

Conclusion:  On behalf of The Federalist Society's Regulatory Transparency Project, thanks for tuning in to the Fourth Branch podcast. To catch every new episode when it's released, you can subscribe on Apple Podcasts, Google Play, and Spreaker. For the latest from RTP, please visit our website at www.regproject.org.

 

[Music]

 

This has been a FedSoc audio production.

Stewart A. Baker

Partner

Steptoe & Johnson LLP


Jane Bambauer

Dorothy H. and Lewis Rosenstiel Distinguished Professor of Law

The University of Arizona James E. Rogers College of Law


Danielle Citron

Jefferson Scholars Foundation Schenck Distinguished Professor in Law and Caddell and Chapman Professor of Law

University of Virginia School of Law


Paul Rosenzweig

Professorial Lecturer in Law

The George Washington University


The Federalist Society and Regulatory Transparency Project take no position on particular legal or public policy matters. All expressions of opinion are those of the speaker(s). To join the debate, please email us at [email protected].
