Deep Dive Episode 237 – Private Rights of Action in Data Policy Settlements
A private right of action, or the ability of individuals to bring lawsuits for violations of a statute, has been a major point of contention in debates over a potential federal data privacy law. This podcast featuring Andrew Kingman (Mariner Strategies), Jennifer Huddleston (NetChoice), and Keir Lamont (Future of Privacy Forum) will dive into the questions surrounding this debate. Is the litigation risk from a private right of action harmful to innovation and small businesses or is it necessary to redress individual concerns? What can policymakers and practitioners learn from state level privacy laws like Illinois’ Biometric Information Privacy Act (BIPA) and the California Privacy Rights Act (CPRA) about the impact of a private right of action?
Although this transcript is largely accurate, in some cases it could be incomplete or inaccurate due to inaudible passages or transcription errors.
[Music and Narration]
Introduction: Welcome to the Regulatory Transparency Project’s Fourth Branch podcast series. All expressions of opinion are those of the speaker.
On September 29, 2022, The Federalist Society’s Regulatory Transparency Project hosted a virtual event titled “Private Rights of Action in Data Policy Settlements.” The following is the audio from that event.
Chayila Kleist: Hello, and welcome to this Regulatory Transparency Project webinar call. My name is Chayila Kleist, and I’m Assistant Director of the Regulatory Transparency Project here at The Federalist Society.
Today, September 29, 2022, we’re excited to host a panel discussion entitled “Private Rights of Action in Data Policy Settlements.” Joining us today is a stellar panel of experts who bring a range of views to this discussion. As always, please note that all expressions of opinion are those of the experts on today’s call, as The Federalist Society takes no position on particular legal or public policy issues.
Today, we are pleased to have with us as our moderator Jennifer Huddleston, who is Policy Counsel at NetChoice, where she analyzes technology-related legislative issues at both the state and federal level. Her portfolio and research interests include issues related to data privacy, antitrust, online content moderation (including Section 230), transportation innovation, and the regulatory state. I’ll leave it to her to introduce the rest of our fantastic panel.
Throughout the panel, if you have any questions, please submit them through the question-and-answer feature so our speakers will have access to them when we get to that portion of the webinar. With that, thank you for being with us today. Ms. Huddleston, the floor is yours.
Jennifer Huddleston: Thank you, Chayila, and thank you to the Regulatory Transparency Project for hosting this and so many great conversations around data privacy and data security, which have certainly been growing topics of interest over the last few years.
As was mentioned, my name is Jennifer Huddleston, and I serve as Policy Counsel with NetChoice, and I’m very excited to have a lively conversation today on what has been, really, one of those big questions when it comes to what should potential data privacy laws look like. And that is the question around, “What, if anything, should policymakers consider when it comes to a private right of action?”
I’m joined today by Andy Kingman and Keir Lamont. Starting with Andy: Andy is the President of Mariner Strategies, and, among other clients, he represents the State Privacy and Security Coalition, which works on data privacy and cybersecurity issues in all 50 states. As a public policy advocate with experience in compliance, he brings a unique and substantive perspective to discussions on how best to increase consumer privacy protections while maintaining operational workability and cybersecurity protections for businesses. He is nationally recognized as a thought leader in the field, and in 2020, he was named to Massachusetts Lawyers Weekly’s “Up and Coming Lawyers” list.
I’m also joined by Keir Lamont. Keir serves as a Senior Counsel with the Future of Privacy Forum’s US legislation team. In this role, he supports policymaker education and independent research analysis concerning federal, state, and local consumer privacy laws and regulations.
Previously, he held positions with the Computer & Communications Industry Association (CCIA) and the Program on Data and Governance at Ohio State University. Keir has a J.D. from Georgetown University Law Center and a B.A. in Political Science and Economics from the University of Florida.
So before I turn it over to our panel—as I think it’s important to understand where this debate currently is when it comes to data privacy and private rights of action in data privacy legislation — we’ve seen over the past few years that the question of enforcement is often a key question at both a state and federal level. One of those enforcement options put forward often is a private right of action or the right of individual consumers or classes to sue for potential violations of a state or federal law. This has had many critics as well as many proponents.
Critics often point to the fact that the American litigation system has produced an over-litigious environment, which can magnify the negative consequences of a private right of action, particularly one with statutory damages. We can look to the example of Illinois’s BIPA, the Biometric Information Privacy Act, which has generated significant litigation over issues such as photo tagging and amusement park annual-pass identification, at times leading to consumers losing access to certain technologies, as we saw with Google’s Art Selfie match feature.
On the other hand, proponents of private rights of action often question whether or not there are enough resources from government agencies to support active engagement around the issue of privacy.
So with that, I would like to turn to each of our panelists with an opening question about how they see enforcement and how they see a private right of action playing into this. Is a private right of action an ideal enforcement mechanism for data privacy legislation, or if not, what would you say is the preferred enforcement mechanism? And do you think there is a need for any guardrails on private rights of action, or any lessons to be learned from existing legislation? And I’ll start with Andy.
Andrew Kingman: Yeah, sure. Thanks, Jennifer, and thanks, everybody, for taking a few minutes here this afternoon to chat about this. Yeah, Jennifer, I think we’ve seen an interesting consensus form over the past several months, particularly at the state level, which is where I focus. Since the California Consumer Privacy Act first passed in 2018, we’ve seen four states subsequently pass privacy legislation, and although those four states are working from a different framework than the California model, what we’ve seen is that there is consistency in at least one aspect of that legislation and that is that there is no private right of action. Now in the California legislation, there is a limited private right of action for data breaches but not for privacy violations.
I think the fact that we’ve seen states as diverse as Utah and Connecticut pass legislation with similar enforcement mechanisms, both with almost universal bipartisan support, really shows that a consensus has formed that private rights of action are not the best way to enforce privacy violations. And there are a few reasons for that, I think. The first is that actual harm from privacy violations is very difficult to prove, and the second is that, because of that, such provisions are ripe for abuse, not by consumers but by the plaintiffs’ bar, who can use vague or very complex requirements in a law to leverage literally millions of dollars in eDiscovery costs from defendants in order to force a settlement.
So given that all five states with a comprehensive privacy law, and two out of the three states with more specific biometric privacy laws, declined to include one, I think there’s a pretty clear policy preference that has been expressed there.
Jennifer Huddleston: Great. And Keir, your thoughts on the same question.
Keir Lamont: Sure. So thank you, Jennifer and RTP, for having me on. So look, private rights of action are one of a suite of potential enforcement mechanisms that can be included in privacy frameworks. My organization tries to take a holistic, case-by-case approach to analyzing privacy legislation, and we don’t have any one answer to the question of whether any particular element of a proposal is or is not ideal.
Now, Andy is absolutely correct about the emerging state consensus against including private rights of action in privacy legislation; however, I think it is also important to note that as the current federal political climate stands, a broad yet tailored private right of action has emerged as part of the compromise that can potentially get policymakers in Washington to, at long last, enact federal privacy legislation.
This summer, we saw bipartisan, bicameral work in Congress to advance the American Data Privacy and Protection Act, or ADPPA, highlighted by an overwhelming 53-2 vote to advance the proposal out of the House Energy and Commerce Committee. ADPPA includes a private right of action, and I can tell you those two nay votes had nothing to do with its enforcement provisions. So at this point, I’m not sure how tenable it remains to try to hold a hard line against any sort of private right of action in federal privacy legislation. The center of gravity in Washington has shifted dramatically over the past four years.
So, potentially, another way of answering your question, the attributes of federal privacy legislation, at least, that could be considered ideal, may be the ones that are necessary to finally get federal privacy legislation over the finish line. The simple fact of the matter is that a federal privacy law is sorely needed. Individuals in 45 states lack baseline privacy rights, businesses are facing an increasingly complex patchwork of state laws, and the lack of a privacy law is undermining America’s leadership and global economic competitiveness.
I would also like to stress that a private right of action should not be viewed as an all-or-nothing proposition. As you allude to, Jennifer, there are many guardrails and tweaks that can be made to a private right of action in order to protect consumers while mitigating the risk of a deluge of nuisance lawsuits that Andy references. The current version of that federal proposal, ADPPA, includes many such provisions in its private right of action, and I will name a few.
The right of action wouldn’t be available until two years after the law takes effect. There would not be statutory damages for mere procedural violations of the Act. It would only permit actions in federal courts, not state courts, raising some very interesting Article III standing questions. In many cases, it requires persons looking to sue to provide prior notice to both regulators and the party alleged to be in violation, including an opportunity to cure. And finally, there would be protections for certain very small companies.
So these are all examples of levers that can be set and adjusted as policymakers consider whether to include a private right of action in privacy legislation.
Jennifer Huddleston: Thank you, Keir. And I think it’s often easy for those of us who spend a lot of time in the privacy space and talking about it online—on Twitter and things like that—to very quickly assume that everyone knows exactly what we’re talking about when we start throwing out acronyms like CPRA and CCPA and the Colorado law versus the Virginia law and things like that. And while I know a lot of our attendees are likely attorneys or are working in the law and so are familiar with the idea of private right of action, I’d like us to take a little bit of a step back to talk about what it is that we’re actually talking about.
So, Keir or Andy, could either of you jump in and, for those who aren’t as familiar, speak briefly to, what are the kind of different types of private rights of action we see, and particularly, this question that tends to emerge a lot in the data privacy space of, what do we mean by the difference in a private right of action with statutory damages versus a private right of action for actual damages?
Andrew Kingman: Yeah, sure. I’ll start, and Keir, if you would like to jump in, feel free. When we talk about a private right of action, it’s, effectively, the ability of individual consumers or a class of consumers to sue a company for a particular violation of a statute. Whether a bill provides statutory damages—which set out particular dollar amounts per violation and, as you can imagine, can add up very, very quickly when companies have hundreds of thousands if not millions of customers—or actual damages—which, again, go to that issue of privacy harms being very difficult to prove—can change both the amount of compensation a plaintiff could receive and the amount of litigation or regulatory risk a business takes on.
Keir Lamont: I think that’s right, and I would also just note that there are certainly different types of private rights of action in the privacy laws we already have on the books. There are laws like the Video Privacy Protection Act and the Telephone Consumer Protection Act, under which any violation is considered a per se harm that individuals can challenge in court. You also have laws like the Computer Fraud and Abuse Act or the Privacy Act of 1974 that require plaintiffs to demonstrate both that the law has been violated and that some degree of serious injury has occurred.
You also have laws that establish different intent or knowledge standards concerning the violation. For example, the Fair Credit Reporting Act has a “willfully fails to comply” standard that provides for higher penalties if that has been shown. And, of course—as I think Andy alluded to earlier with the existing California law—many of these privacy laws contain dozens of both consumer rights and business obligations, which may pose greater or lesser risks if there’s been a violation, and a private right of action doesn’t necessarily need to apply to every particular aspect of a privacy law.
Jennifer Huddleston: I kind of mentioned this in the opening as I was introducing you two, but one of the reasons that private rights of action tend to get so much discussion is some of the unique facets of the American legal system—notably, the lack of a “loser pays” rule, questions around class actions and attorneys’ fees, and questions about who a private right of action with statutory damages would really benefit—whether this is an enforcement option that just creates excess litigation or whether it can actually help the impacted consumers.
As I mentioned, we’ve seen a lot of concerns about this, particularly in response to the Illinois law that we’ll dig into a little bit later in this panel. But I’m also just curious as to your thoughts on—and I’ll start with Keir this time—how does the American legal system and its approach to litigation differ from other legal systems, particularly the European legal system, and does that impact what impact a private right of action might have?
Keir Lamont: So look, Jennifer, do not let the accent fool you. I am not a European attorney, and I am not an expert on litigation in Europe, though I do understand that, due to both legal and cultural differences, there tends to be less of a litigious climate in Europe. And in the EU, it’s often said that regulators tend to take a more principles-based, collaborative approach to industry compliance.
However, I think it is worth noting that within the European overarching privacy law—the General Data Protection Regulation—that law does establish some mechanisms for individuals to take action to enforce their privacy interests. You have Article 77, right to lodge a complaint with a supervisory authority; Article 79, right to an effective judicial remedy against a controller or processor; and Article 80, right to have a non-profit organization lodge a complaint on an individual’s behalf.
So certainly, if there was a broad private right of action in the US space, I think that would operate differently than how we see the self-help mechanisms of European law operate in the four, five, six years now that that law has been in effect. But I think it’s not accurate to point at Europe as necessarily a reason why US privacy legislation should or should not include a private right of action.
Andrew Kingman: Yeah, I think that’s generally correct. I’m also not an expert on the European legal system, but I think the “loser pays” model is a really critical component here. We don’t have that, in general, here in the United States; none of the privacy laws we’ve referenced, I think, have a “loser pays” model, and that model really can serve to deter some of the more frivolous or questionable lawsuits. There are some mechanisms in the ADPPA—the federal bill—that, while not a “loser pays” model, do impose some threshold procedural requirements that must be met before plaintiffs’ attorneys can institute a lawsuit. So there are some of those, but I think that “loser pays” model is a very, very critical distinction.
And to Keir’s point, in the European system there generally is a much more collaborative spirit with regulators, where entities who may be, or have been accused of being, in violation have opportunities to remedy that before it gets to a litigation scenario. So again, not an expert, but I think there are some critical distinctions, process-wise, that warrant a closer look before treating Europe as a model or anything along those lines.
Jennifer Huddleston: Well, and to be clear, Keir, I was mainly trying to keep who got to start first fair, not pulling on any accents there. I do want to turn back to the US and to the states, something I know you are both experts on. Andy, in your opening statement, you alluded to the fact that a movement away from private rights of action has become the trend at the state level. And you mentioned, in particular, that two out of the three states that have passed biometric privacy legislation—laws that have been around a bit longer than the current, more general consumer data privacy bills—lack a private right of action.
In contrast, the State of Illinois has a private right of action, and as I mentioned earlier, this has been the subject of much debate. So for both of you, but starting with Andy: what can we learn about the impact that a private right of action—such as the one in the Illinois BIPA—has on consumers and businesses, looking at the experience of Illinois versus that of Texas and Washington, which passed similar legislation but without a private right of action?
Andrew Kingman: Yeah, thanks. Great question. I think Illinois is, interestingly, sort of the poster child for both the business community and consumer advocates. Consumer advocates tend to tout Illinois’s law as the strongest privacy law in the country because of its private right of action. The business community tends to look at it as an example of where class action lawsuits have become an absolute cottage industry.
I think there are a few lessons to be learned here. BIPA has some very unique aspects. The first is that it was passed in 2008—less than a year after the introduction of the iPhone, there was a law regulating biometrics. The way the online ecosystem has evolved does not really match up with the structure of that bill. It effectively puts strict liability on companies that don’t comply, but privacy laws have since evolved to recognize the explosion of vendors standing behind the consumer-facing entity, and BIPA doesn’t make that distinction. As a result, there are scores and scores of companies that are literally unable to comply with BIPA’s requirements because they never interact with the consumer. This, in turn, has created what I think is accurately described as a cottage industry of class action lawsuits, many of them over relatively minor infractions, and they are extremely, extremely costly.
Jennifer, as you mentioned at the top, I’ve worked not only on the policy side but also on the compliance side, helping companies work to comply with various privacy laws. I can tell you, when it comes to Illinois, more and more companies are just looking at the state and saying, “Well, we won’t offer our security services there,” right? “We’re not going to deploy services that might keep consumers safer, because the way the law is structured doesn’t allow us to operate without getting consent from people we think might be cybercriminals, and we’re just not going to take the litigation risk.”
So, as a result, there are real effects on consumers. Consumers in Illinois are not able to avail themselves of the cybersecurity and anti-fraud protections that consumers in the 49 other states have. This is a real, tangible danger. There are other consequences—other types of services are not provided—but I think the most significant effect on consumers in Illinois is that the security and anti-fraud sector has really been driven out of the state because those companies are simply not able to comply with the statute as it has been drawn up.
And the other piece of this is that consumers often do not see real, tangible benefit from these class action lawsuits. In almost every case, the award they receive is in the high tens of dollars, maybe eighty or ninety, up to perhaps a couple hundred dollars. So this is not a windfall for consumers when entities are found in violation or end up settling with the plaintiffs’ bar.
And just for some context, in the last five years, there have been over 1,000 class action lawsuits filed. The Illinois Supreme Court ruled in 2019 that no concrete injury is required to bring a lawsuit under the statute, and that really opened the floodgates for litigation in the state. So we’ve seen Illinois explode as a center for litigation, and the law has really failed to winnow bad actors from good actors.
Jennifer Huddleston: Keir, do you have anything to add or any different thoughts?
Keir Lamont: Sure. So I actually might be willing to go a little further than Andy on the Illinois BIPA. I think that, outside of the plaintiffs’ bar or a privacy absolutist, you’re going to have a very, very difficult time finding anyone willing to say that the private right of action in BIPA is operating in an ideal manner. However, BIPA was the basis for a major settlement with Clearview AI, involving facial recognition collection and processing that many people found extremely, extremely objectionable. And when you look at the two other state biometric privacy laws—Texas’s and Washington State’s—I personally know of only one enforcement action carried out pursuant to either law. Andy might know more, but I believe that action involved conduct the business in question had already ceased for other reasons.
So it’s not clear to me that, in the absence of a private right of action, Texas and Washington State have done that much to advance biometric privacy for the citizens in either jurisdiction. So once again, I think this points to the fact that when we talk about private rights of action, they do not have to be an all-or-nothing proposition, and there are many levers and tweaks that we can pull for policymakers thinking through how to create an ideal enforcement situation.
Andrew Kingman: Yeah. I think, first of all, what I would say in terms of Washington and Texas, many, many attorney general investigations are confidential, and so, I think it’s very difficult to pinpoint or assess the efficacy of those laws based on publicized enforcement actions.
The other thing I would say is that Illinois is a good example of some of the fundamental assumptions many proponents of private rights of action make that I just think aren’t true. One of those assumptions is that, without a private right of action, businesses will willfully disregard a law. And no doubt, with any statute, you can dig up some bad actors, but that’s not the point here. The point is that this law is driving away good actors and that it’s not an effective tool for winnowing out those who are violating a law egregiously. Certainly, many of the practices Clearview employed have been found objectionable by lots of folks, but that’s one of, as I said, 1,100 class action lawsuits.
Other states have found ways to go at that type of conduct without this type of legislation. So again, I just would question whether that’s, ultimately, the right way of trying to get at that conduct when you are wrapping in so many other good actors and not only driving them away from Illinois but leaving Illinois consumers unprotected.
So that’s where I’ll end.
Jennifer Huddleston: So before I move on to my final few questions for the panel, I would like to remind our attendees on Zoom that you can leave questions for our panelists in the Q&A or chat features.
So, building off of this case study—for lack of a better term—of Illinois versus Texas and Washington, but also thinking about what we’ve seen emerging at the state level: while many of us are concerned about the potential implications of a state patchwork of privacy laws, as was mentioned earlier, we are seeing renewed federal data privacy conversations. That is exciting to many of us in this space, because I think I can say there is pretty general agreement that a federal framework would be preferable to a state patchwork, even if there is much debate about what that federal framework should actually look like.
Thinking about what we’ve seen at a state level—and I’ll start with you, Keir—what lessons should federal policymakers take away as they consider potential enforcement mechanisms in federal legislation around data privacy?
Keir Lamont: Sure. So of the five state comprehensive privacy laws that have been enacted, only one is currently in effect, and that is California’s—the CCPA—and it has been mentioned that the CCPA provides a very narrow private right of action just for data breach litigation. For all of the other consumer rights, protections, and business obligations, the law is very clear: no private right of action—no standing to sue in a personal capacity based on one of those violations.
However, to this point, we have seen dozens, if not hundreds, of lawsuits filed in California citing the CCPA as an attempted basis to get into court. It’s been somewhat surprising to me, and I think it really underscores that, in crafting a privacy law, it is very important to get the scoping right and be very precise with statutory language when it comes to any inclusion of a private right of action, because any ambiguity—or even, as in California, a lack of ambiguity—is likely to be seized upon by both plaintiffs and defendants to try to broaden or narrow the scope of a private right of action beyond the drafters’ intent.
Jennifer Huddleston: Andy, I’ll turn it over to you. What lessons would you like federal policymakers to take away from the current enforcement mechanisms we’ve seen in state laws?
Andrew Kingman: Sure. Well, I think Keir’s point is very well taken. In the first six months of the CCPA being in effect, we saw over 50 class action lawsuits filed, trying to find what we would call a back-door private right of action—whether it could be bootstrapped onto other statutes that allow a private right of action. So we know that there will be an effort. I think Keir’s point about very precisely structuring a private right of action—if, in fact, one is to be included in federal law—will be incredibly important.
But I think there are two other pieces here that go along with any kind of individual redress. And I do agree with Keir—my perception is that, at the federal level, there is more tolerance or willingness to consider such mechanisms, because there would necessarily be just a single process and system, versus, in the states, the possibility of 50 different jurisdictions with 50 different private rights of action.
But there are two points that come along with this that I think are really good lessons the feds can draw from the states: the first is the value of a right to cure, and the second is preemption and the degree of preemption. As Keir said, California is the only state with its privacy law in effect right now, and for the two years it has been in effect, it has had a right to cure. That is being eliminated as of January 1 of next year, but the California AG released a report in January of this year stating that over 75 percent of the businesses to which it had sent a notice to cure had come into compliance.
What the right to cure is, is a short time period—30 to 45 days—during which, once the attorney general notifies a business that there has been a violation or an alleged violation, the business can not only fix the issue but also expressly state in writing that that particular issue will not happen again. It’s beneficial for the attorney general’s office because it’s a relatively low-resource but high-impact way to ensure compliance. It benefits consumers because the issue they’ve flagged is resolved in a very short period of time. And it’s beneficial to businesses because, again, it helps separate the good actors from the bad actors.
These are very complex laws. They have a lot of granular requirements, and it’s not difficult to overlook something, or to deprioritize one requirement in favor of a more impactful privacy compliance posture on another.
And so, I think the right to cure has proven a very valuable and helpful tool that can offset some of the risks of a private right of action. The other piece that’s absolutely critical here is the degree to which preemption comes into play. There is less incentive for businesses to support individual redress if they feel there are still opportunities in the states to pass legislation on the same topic with private rights of action; at that point, it’s difficult to understand what the incentive would be to support it at the federal level. Those two pieces go hand in hand with—again, to Keir’s point—a very carefully and precisely constructed level of individual redress, if federal privacy legislation is going to happen.
Jennifer Huddleston: So one of the issues with private rights of action and statutory damages is that there is often not a clear harm associated with the litigation that stems from alleged violations, but that litigation can still be incredibly costly both to businesses and—as was mentioned earlier—to consumers, through companies’ decisions about whether or not to enter a certain market.
We often think of data privacy as a “tech” issue. We’re often hearing it brought up around social media companies or online behavior. But is this really a tech issue, first off, and if so, are these questions of harm unique to the digital space? And how should policymakers think about the question of harm when it comes to data privacy enforcement?
And I’ll let whichever of you wants to start first. I know that’s a question that is very much in debate in data privacy these days—what is harm—so I don’t necessarily expect us to solve this on the panel, but I’d like to get both of your thoughts on that question.
Keir Lamont: I can lead off. Is this a tech issue? Increasingly, everything is a tech issue because everything is tech. More and more throughout the economy, data is collected and processed to enable the delivery of very beneficial services, but that can also raise risks of unconsented or harmful processing against consumer interests. So let's go back to the Illinois BIPA and take that as an example. Certainly, under that law, there have been costly settlement requirements and litigation for mere procedural violations—for not dotting your i's or crossing your t's in a written consent flow. However, when we're talking about biometric data, we have to recognize that it is one of the more sensitive categories of information. It's often said, "You can change your password. You can even change your social security number, but it is much, much harder to actually change your face."
And to many people, having your face scanned without your knowledge or consent—your faceprint lifted from that image and added to a perpetual line-up that can be used to track, identify, or render adverse judgments against you at some point in the future—even without a separate economic harm, that can constitute a very objectionable personal invasion of privacy—a clear harm.
And now, obviously, there are plenty of Article III standing issues around what does or does not constitute a cognizable injury. However, as we seek to pass privacy laws and support privacy, a major purpose of doing so is to ensure that the tech industry and broader economy can maintain consumers' trust in digital products and services. And I don't think policymakers should overlook these interests and sensitivities. They must be accounted for.
Andrew Kingman: Yeah. I don’t disagree with that, Keir. And I think privacy is certainly becoming a market force on the business side as well. Businesses know that being more responsive to consumers on their privacy issues can be an advantage in the market as well. So I think that’s true. I think BIPA is interesting. I think when we think about the degree of injury or what standing might look like — actually, taking one of the comprehensive bills in the states is kind of an interesting thought exercise.
Three out of the four bills require opt-in consent for sensitive data, as Keir was referencing with the BIPA legislation. And that sensitive data in those bills are largely defined as precise geolocation information, biometric information, health diagnoses, children’s information, etc. So if a business fails to obtain opt-in consent to process that data, I think that really could constitute something that could be a cognizable harm, depending on how that data was then processed.
Jennifer Huddleston: Well, I thank you both for your time. As a closing statement, I would just ask you both to end with what advice would you have for policymakers — if you could just give them one piece of advice to think about when it comes to this question of private rights of action, either at a state or federal level. Andy, since I let you get the first opening statement, Keir, I’ll let you get the first closing statement.
Keir Lamont: Sure. So, again, I would just have to return to the recommendation that private rights of action should not be considered an all-or-nothing proposition. As I've mentioned, there are many different constraints and protections that could be included to allow consumers to seek redress when there's a very objectionable violation of their privacy interests, while putting a lid on some of these nuisance lawsuits. There are many great resources out there that lay out these different levers and tools, and I would point to my organization, the Future of Privacy Forum, which has tried to do a lot of smart thinking about some of these available options.
Andrew Kingman: Thanks, Keir. I would go back to the criticality of the right to cure. I think at the end of the day, the most important thing is not only letting businesses comply with the law and encouraging compliance over punishment, but also ensuring that you have mechanisms to separate good actors—who may have committed an unintentional violation but who are in good faith attempting to comply, again, with a very complex law—from bad actors who are willfully ignoring the law. So in my mind, that's a very, very effective tool that has to be looked at in conjunction with strong preemption, whether at the state or federal level, to ensure that there is a single enforcement body and a single set of laws and requirements that businesses can comply with.
Jennifer Huddleston: Well, thank you, again, both for your time. With that, I’ll turn it back over to Chayila from the Regulatory Transparency Project for some closing comments.
Chayila Kleist: Absolutely. On behalf of both myself and the Regulatory Transparency Project, I want to thank all our experts for sharing their time and expertise, and I want to thank our audience for tuning in and participating. We welcome listener feedback at [email protected] If you're interested in more from us at RTP, you can continue to follow us at regproject.org and also find us on all major social media platforms. Again, thank you all for joining us today, and until next time, we are adjourned.
Conclusion: On behalf of The Federalist Society’s Regulatory Transparency Project, thanks for tuning in to the Fourth Branch podcast. To catch every new episode when it’s released, you can subscribe on Apple Podcasts, Google Play, and Spreaker. For the latest from RTP, please visit our website at www.regproject.org.
This has been a FedSoc audio production.
The Federalist Society and Regulatory Transparency Project take no position on particular legal or public policy matters. All expressions of opinion are those of the speaker(s). To join the debate, please email us at [email protected].