Deep Dive Episode 213 – After California and Virginia, What’s Next? Examining the State of State Data Privacy Legislation
Data privacy and data security are tech policy concerns that resonate with many voters and policymakers. In the absence of federal data privacy legislation, some states have passed, and others are considering, their own legislation to deal with data privacy questions related to specific technologies including biometrics, among many others. While California had the first general data privacy law at a state level, Virginia and Colorado passed different laws last year. Now states ranging from Connecticut to Utah are considering data privacy laws often modeled after these two examples. What do these data privacy laws mean for consumers and companies, both large and small? What might the landscape of state data privacy laws look like after the 2022 legislative session?
Although this transcript is largely accurate, in some cases it could be incomplete or inaccurate due to inaudible passages or transcription errors.
[Music and Narration]
Introduction: Welcome to the Regulatory Transparency Project’s Fourth Branch podcast series. All expressions of opinion are those of the speaker.
On March 15, 2022, the Regulatory Transparency Project hosted a virtual live podcast, titled “After California & Virginia, What’s Next? Examining the State of State Data Privacy Legislation in 2022.” The following is the audio from that event. We hope you enjoy.
Colton Graub: Good afternoon, and welcome to The Federalist Society’s Fourth Branch Podcast for the Regulatory Transparency Project. My name is Colton Graub. I’m Deputy Director of RTP. As always, please note that all expressions of opinion are those of the guest speakers on today’s call. If you would like to learn more about each of our speakers and their work, you can visit RegProject.org, where we have their full bios.
After opening remarks and discussion between our panelists, we will go to audience Q&A. So please be thinking of the questions you would like to ask our speakers. This afternoon we’re pleased to host a conversation discussing the state of state data privacy and data security legislation. To discuss this topic, we’re pleased to welcome an expert panel of distinguished speakers.
Jennifer Huddleston will be moderating the discussion. She is currently Policy Counsel at NetChoice. Jennifer, I’ll pass it off to you to get the conversation going in earnest.
Jennifer Huddleston: Thank you Colton. And, as you mentioned, we have two very distinguished guests on this issue who have done excellent work and really help to understand the full landscape of what exactly is going on in the ever-changing world of state data privacy laws.
So I’m joined today by Keir Lamont, who is Senior Counsel with the Future of Privacy Forum. Keir holds a J.D. from Georgetown University Law Center, and a B.A. in Political Science and Economics from the University of Florida. He works with the Future of Privacy Forum’s U.S. Legislation Team, and has worked to support policy-maker education and independent analysis concerning federal, state, and local consumer privacy laws and regulations. Welcome, Keir.
Keir Lamont: Thank you for having me.
Jennifer Huddleston: We’re also joined today by Daniel Castro. Daniel is the Vice President of ITIF, the Information Technology and Innovation Foundation, and Director for the Center of Data Innovation. He has a B.S. in Foreign Service from Georgetown University and an M.S. in Information Security, Technology, and Management from Carnegie Mellon University.
Daniel writes and speaks on a variety of issues related to information technology and internet policy, including privacy, security, intellectual property, internet governance, e-government, and accessibility for people with disabilities. Welcome, Daniel.
Daniel Castro: It’s great to be here, Jennifer.
Jennifer Huddleston: Thanks. And thank you both for joining. We were joking a little bit before we went live that this is the kind of topic where we're recording at 1:00 p.m. on a Tuesday, and by 3:00 p.m. on that same Tuesday, more may have happened. We're seeing more and more states introduce different data privacy proposals. We're seeing those proposals make it to different points in the state legislative process, whether that's being heard in committee, or just being introduced, or votes on the floor and crossover deadlines.
This really seems to have evolved over the last few years from when we were just talking about CCPA to what seems to be a growing number of states. Most notably, about a year or so ago, we saw Virginia pass a data privacy law that was a distinct consumer protection law from what we had seen previously, modeled on California. We also saw Colorado pass its own data privacy law last session. And now we’re seeing more states consider different proposals, some of which look like these existing proposals, and some of which look different.
So, to start with, Keir, the title of this podcast is “Beyond California and Virginia.” Can you give us a brief description of what the state of state data privacy laws looks like right now, or, at least, prior to this current legislative session?
Keir Lamont: Sure thing, Jennifer. And thanks again to you and to RTP for having me on. Now, given everything that is happening, I am not entirely sure if a brief description of the state privacy landscape is actually possible anymore. But I promise that I’ll do my best. So, for us, the story really begins back in 2018, following the adoption of the European General Data Protection Regulation, and controversial revelations about the use of personal information by the political firm Cambridge Analytica, which drove heightened American attention to privacy issues.
In the absence of comprehensive federal privacy legislation, lawmakers started to work on state-level privacy frameworks that would guarantee certain modern rights and protections for their citizens’ data, such as the ability to access, correct, and delete personal information, and the ability to opt out of the sale of data.
The first state to move was California, which, as part of a political compromise to avoid a ballot initiative, rapidly adopted the California Consumer Privacy Act, or CCPA, in June of 2018. This was a big turning point, because, while there have long been state and federal laws in the U.S. governing the use of particular categories of data, or setting privacy rules for specific industries, the CCPA was the first law to try to set rights and protections for consumer information in a comprehensive manner across the economy.
Now, at the time, many expected that other states would follow in the lead of California and enact similar privacy laws, which is what happened when California passed a breach notification statute back in 2002. However, while numerous California copycat-style bills have been considered in recent years, none have made it over the finish line.
One reason for this may be that since enacting the CCPA, California privacy law has never really sat still. The CCPA was amended several times and went through multiple rounds of rulemaking by the California attorney general. Then, in 2020, California voters adopted a new ballot initiative to strengthen and expand the CCPA, creating the California Privacy Rights Act in its place.
This ballot initiative also created a new regulatory body, the California Privacy Protection Agency, which is entering a new rulemaking process that will hopefully be completed prior to the CPRA's January 2023 effective date.
However, a separate trend emerged in 2019, when lawmakers in Washington State decided to go in a different direction and started working on a distinct privacy law, the Washington Privacy Act. This legislation was generally viewed more favorably by industry, in large part because it borrowed familiar and predictable concepts from European data protection law.
While the Washington Privacy Act has come close to passing on multiple occasions, it never quite made it over the finish line, in large part, due to disputes over enforcement. Stakeholders disagreed over whether to leave enforcement exclusively to the attorney general, to enable private lawsuits, or to create a new regulatory agency. Enforcement issues have a tendency to feature very prominently in the political debates over privacy legislation. And I expect that we will return to this topic.
Then, in 2021, two new states enacted comprehensive privacy legislation: Virginia and Colorado. Each of these laws borrowed significant language and principles from the failed Washington Privacy Act. So while there are important distinctions between the Virginia Consumer Data Protection Act and the Colorado Privacy Act that regulated entities will need to pay attention to, they share many fundamental similarities. All three laws — the new CPRA, the VCDPA, and the CPA — will go into effect in 2023. But note that California and Colorado still need to go through rulemaking processes to issue implementing regulations.
So that brings us up to 2022. With three laws on the books, many commenters in this space have wondered if the dominoes are really set to start falling and for state privacy laws to begin to sweep the nation. Personally, I am too much of a coward to make legislative predictions, but I saw many smart folks on Twitter guess that we could see somewhere around four to six new laws enacted this year. The big question, of course, is what would those laws look like?
In the past couple of years, state legislation has typically followed either a California or a Washington state/Virginia-style approach. But there are two additional privacy frameworks that people in this space should be aware of in order to appreciate the full state privacy landscape.
First, in 2021, the Uniform Law Commission created a model privacy law that is, in some ways, rooted in the federal Privacy Act of 1974, and that distinguishes between compatible, incompatible, and prohibited data practices, de-emphasizing notice and consent. Second, there has been another set of bills that could be called the New York model, which would regulate privacy by establishing fiduciary duty-style obligations on companies that hold consumer data, such as imposing duties of data care and data loyalty.
So that should give you a relatively complete overview of the U.S. privacy landscape. And I will pause here because I think that brings us up to the beginning of this year.
Jennifer Huddleston: Thank you very much. As you mentioned, it’s a growing landscape and a growing history. And it’s increasingly difficult to do so briefly. Keir, you mentioned some of the similarities that we’ve seen in these state-level laws. But there also seem to be some pretty significant differences.
Daniel, looking at the three major state-level laws that we’ve already seen enacted in this comprehensive consumer data privacy space, how would you describe the differences between the California, Colorado, and Virginia state data-privacy laws? And what might that mean, in terms of compliance for businesses, and in terms of rights for consumers?
Daniel Castro: That’s a great question, Jennifer. I think a lot of outside observers might look at some of these laws and say, “Well, they’re pretty similar. They’re all about privacy and they’re all about giving consumers certain rights. So what’s the big deal here? It doesn’t really cost companies that much to comply.”
But what we're seeing is that, although these laws are generally trying to achieve the same thing, there are some quite significant differences. These differences span several areas. So, for example, there are different thresholds for what size of organization these laws apply to — whether that's based on the amount of revenue a company has, the number of customers it has within the state, or whether it sells data for a specific purpose. They also have different exemptions.
So the laws generally apply to all consumer data, except for a number of carveouts. So, for example, they generally don't apply to institutions that are subject to HIPAA, the health privacy law, or to institutions that are subject to the Gramm-Leach-Bliley Act. But, again, there are variations between the laws. Some define the carveouts very clearly — health data is exempt — but others don't. And that type of variation means that every time one of these laws passes, companies have to take a very close look to see whether they're impacted by it.
And then enforcement looks different in each of these states. So California sets up a new California Privacy Protection Agency as the chief privacy regulator for the state. In contrast, Virginia relies on the state attorney general. Colorado relies on the attorney general, plus the district attorneys. There are some differences, too, in terms of the right to cure that California created — the idea that, if there's some kind of violation, a company could come back and fix the problem. But that sunsets next year.
And so Virginia still does have a 30-day cure period. Colorado has a 60-day cure period. And then, for consumers, this is where there's probably a lot of similarity in some of these rules. So, generally, each law lays out a lot of the same rights: a right to access, a right to correct data, a right to delete, a right to opt out, and a right of portability.
But, even here, there are some differences, because each of these laws uses different definitions of personally identifiable information or sensitive information. So, basically, each law puts different information within scope. And then there are some other smaller variations. But, again, the problem with these laws is that even some of these smaller variations really just create a lot of inconsistency. So if you are in one state and you move to another state, how your data is treated is different. And that creates confusion for businesses. And it creates confusion for consumers.
Jennifer Huddleston: It seems like those inconsistencies are a growing concern. And we'll return to those in a bit. Keir, I know you said you were hesitant to make predictions, in terms of what the landscape would look like. But it seems like one thing that's been happening in 2022, when it comes to comprehensive state data privacy laws, is that we've seen various models, various proposals, introduced in both red and blue states.
When we’re looking at kind of the different models of legislation that have been introduced, I’m curious, what trends are you seeing? And, what, if any, trends are you seeing in the states or legislation, where certain models tend to be moving forward more significantly when it comes to those issues?
Keir Lamont: Sure. So I think a big question for a lot of people has been whether the states will continue to splinter off in their own different directions, as Daniel discussed, or whether we will see legislation across the states begin to kind of coalesce around a particular approach, be it something of a California-style model, a Virginia-style model, or one of the other distinct approaches that we’ve discussed.
Now, about two and a half months into the year, it is starting to look like we can project the emergence of a dominant model. California-style bills have largely faltered, while Virginia-style bills have seen far greater success. Lawmakers have typically had a very difficult time advancing legislation modeled on either the California CCPA or the impending CPRA. This year alone, in Hawaii, a bill modeled on the CPRA was voted down in committee. A CCPA-style bill in Maryland looks like it will be replaced with a committee-study bill. CCPA-style rules were struck from a bill in Maine. And, in Indiana, a California-style bill was replaced with a Virginia-style bill. And those are just the California-style bills that have seen any action. Others have not moved at all.
Now, on the other hand, bills modeled on the Virginia law have typically seen more success. This year, Virginia-style, or VCDPA-style, bills passed the Wisconsin Assembly and the Indiana Senate. And while those bills did not make it over the finish line before their sessions expired, there is still active privacy legislation following the Virginia framework moving in Utah, Iowa, and Connecticut. If we continue to see states agree on a single approach to privacy, even if those states iterate on that underlying framework, it's possible that concerns about the emergence of an unworkable patchwork of disparate privacy standards could diminish.
I will note that likely the biggest outlier in this trend that I’ve pointed out has been Florida, where, perhaps surprisingly, a Republican-driven privacy bill — modeled on the CCPA and containing a relatively broad private right of action — has passed the state house in the last two sessions, though it failed to gain much traction in the state senate. However, I can say this with love, as a ten-year Florida resident; Florida can have a tendency to do things in its own way.
Now, picking up on another point that you mentioned, coming into this year, the states that have enacted comprehensive privacy laws — California, Virginia, and Colorado — all, at the time of enactment, had trifecta Democratic governments. So a big question has been whether a proverbial red state would adopt privacy legislation this year. And, in a very exciting development, it appears that Utah is on the cusp of doing so. The Utah Consumer Privacy Act cleared the legislature by unanimous votes on March 3, and will soon be transmitted to Governor Cox for his signature.
As I mentioned, the Utah bill generally follows the Virginia framework. And, if enacted, Utah would actually be unlikely to introduce any unfamiliar compliance obligations for companies that are already preparing to comply with the Virginia law. However, the Utah law would also provide for significantly fewer consumer rights and protections.
For example, unlike the Virginia law, the Utah bill does not provide a consumer right to correct inaccurate data, and provides a far narrower consumer right to delete personal data. The Utah law also does not require companies to obtain opt-in consent for the use of sensitive personal data. The Utah law also does not establish requirements around non-discrimination, or minimizing the collection and secondary uses of data.
Utah also does not require companies to conduct data-protection impact assessments. And Utah also does not provide a consumer right to opt out of the processing of data for high-risk profiling decisions. Those are significant divergences. And that was something of an incomplete list.
So, I am very closely watching to see what, if any, impact the anticipated enactment of the Utah legislation will have on other states. In an early indication, just yesterday, lawmakers in Iowa amended Virginia-style privacy legislation to bring it into closer alignment with Utah. And that bill then passed the state house in Iowa on a 91-2 vote. So it is possible that we will see Utah emerge as a distinct red-state model for governing the use of personal data, which would be largely interoperable with some of our existing privacy laws on the books — Virginia and Colorado, but, arguably, not extending as far.
Jennifer Huddleston: Thank you for that overview of what these trends look like and what has made 2022 a bit unique. It sounds like, if we were having this conversation a year from now — if we were sitting here in March 2023 — the title of this podcast would be at least a little bit longer: we'd probably have at least one or two, or even as many as four or five, states to add to the title when talking about states with comprehensive data privacy laws.
Daniel, I know you’ve written about the growing cost that the state data privacy laws can have, and particularly how it impacts small companies and startups. Can you tell us a bit more about what you’ve found, and, particularly, how you would expect this to change as the number of states that may be passing this type of legislation grows?
Daniel Castro: Absolutely. So, as Keir mentioned, between 2018 and 2021 — that three-year period — thirty-four states either passed or introduced a total of 72 data privacy bills. And we're seeing this trend continue with Utah, Florida, and Washington. So the point here is that, barring federal intervention, we are likely going to see this complex regulatory landscape for data privacy in the United States continue to develop as more and more states go down the path of creating laws, or considering them.
And we don’t have to speculate too much about the future to make this prediction. All we have to do is look back at the past history of data privacy laws. Because, even though we’re having this debate now about do we need comprehensive data privacy laws, as opposed to sectoral laws, which is what the United States has done in the past, we have seen states pass privacy laws in the past. And we’ve also seen the kind of, this type of scenario where one state does something that a bunch of other states think is a good idea, and they all model it.
And the clearest example of that is data breach notification laws. As of last year, all 50 states now have a data breach notification law. It took about 15 years to get there, but they all did it. And they all did it a little differently. That is also an example where it would have been a lot easier to have a single federal law that clearly states: this is what a data breach is; this is who you need to notify; this is when you need to notify them; and these are the protections — like encryption — you can put in place, so that if encrypted data is released, it's not a data breach.
There's a lot that could have been done at the federal level to standardize that. And we never did it. Instead, we now have 50 different data breach laws. Now, data breach laws are fairly narrow in scope, so, even though there's variation, that variation is somewhat limited. Comprehensive privacy laws, on the other hand, as you are hearing, are significantly broader.
So, in January, I coauthored a report that looked at what types of costs we might expect if this trend toward a patchwork of data privacy laws continues. Our goal was to get a sense of the additional compliance costs, the increase in market inefficiencies, and any other economic implications we might see with a patchwork of state laws versus a single federal law.
And so our report found, after building an economic model, that a patchwork of 50 state privacy laws could impose total out-of-state costs of between $98 billion and $112 billion annually. What I mean by that number is that this isn't just the cost of in-state compliance, but the cost to businesses within each state of complying with the 49 other states' laws. That would be $98 billion to $112 billion annually. So, over a ten-year period, you're looking at total out-of-state costs exceeding $1 trillion. And, for small businesses specifically, out-of-state costs would be between $20 billion and $23 billion annually.
And these are big numbers in this report, but they align very closely with what we've seen in other studies. So, for example, when California passed its first law, the CCPA, it commissioned its own study to estimate the compliance burden of the law. That study found that compliance costs would be at least $55 billion — and that was just for California businesses. It didn't even consider the out-of-state impact of the law.
And so we're very concerned about this trend and the impact it would have. Because so many of these laws are built off the premise of the GDPR. Lawmakers saw the GDPR, thought, "We need a comprehensive law," and wanted to copy it. But one of the most important lessons from the European Union's GDPR was that it was intended to create a digital single market throughout Europe. It was intended to harmonize laws across all the European member states, because they recognized that the lack of a single law was incredibly inefficient and was hurting their digital economy.
And so what’s striking to me is that in the United States we kind of took the wrong lesson. We only took the part of the lesson that said we need a privacy law, not the part of the lesson from the GDPR which is that we need to have a harmonized law across our economy if we want to have an efficient and productive digital economy.
Jennifer Huddleston: When we’ve been talking about data privacy laws today, so far we’ve been talking about these broad, general, consumer-privacy legislative proposals that we’ve seen. But there are other cases where we’ve also seen states considering data privacy-related legislation. For example, both Maryland and Maine have considered biometric information privacy proposals closely modeled after an existing proposal in Illinois. I’m curious, what impact could these more specific laws that may be looking at one particular type of data have on innovation and consumers? And, what, if any, trends are you guys noticing in those type of more specific data privacy actions at a state level as well?
Keir Lamont: Well, so I can jump in on that. So, look, in my view, biometric data, which you bring up, is an inherently sensitive and risky category of personal information. It is often said that you can change your password, and you can even change your social security number, but you can't change your face. So, in my view, it is absolutely appropriate for lawmakers to be exploring ways to provide heightened protection for biometric information, whether in the context of a comprehensive privacy framework or in a stand-alone bill.
At the same time, the Illinois Biometric Information Privacy Act, which you mentioned, went on the books way back in 2008. And, given modern technology, modern risks, and current business practices, there are legitimate questions concerning both BIPA’s definitions and its approach to enforcement that other states looking at BIPA as a potential model for legislation today, in 2022, should carefully consider.
On biometrics, I am actually most interested to see if the Illinois legislature adopts a bipartisan amendment to the BIPA law that would create certain carveouts to more readily permit the use of biometric information for legitimate security purposes.
In terms of other sectoral privacy laws that protect consumers and enable responsible uses of data, one of the areas I'm most optimistic about is the regulation of direct-to-consumer genetic testing services, which handle another extremely sensitive category of data. In the past year, California, Utah, and Arizona have all adopted strong laws governing the use of genetic data that were supported by industry and consumer groups alike.
These laws are also largely consistent with a best-practices document that my organization, the Future of Privacy Forum, published in 2018 — so that's an obligatory plug. So far this year, we have seen Wyoming adopt a law that follows this approach. And additional states may be getting ready to follow suit.
And, then, finally, I would also expect in the coming year to see states increasingly consider legislation focused on protecting the online privacy and safety of children and adolescents, beyond the existing federal COPPA statute. One particularly notable bipartisan bill is in California, which would hold companies responsible for following an age-appropriate design code, similar to a governance framework that was recently adopted in the United Kingdom.
Jennifer Huddleston: Daniel, do you have anything to add? What are you noticing, with regards to some of these additional privacy considerations that states may be approaching?
Daniel Castro: Well, I certainly agree that biometrics and children’s privacy are probably the two hottest issues right now, and ones that states are looking at. And, I would say, with biometrics, I think there’s often some confusion among policy-makers about what the concern is. So it is true that you can’t change your biometrics. You can’t change your face. But, also, your face isn’t private. So when we start talking about privacy, and we’re starting to talk about biometrics, sometimes there’s a little bit of confusion, because we’re not actually talking about information that is necessarily private.
And so it gets a little tricky, right? A faceprint is a digital representation of your face that's converted into a smaller amount of data — kind of like how a fingerprint isn't your finger, but just a representation of part of your finger. That faceprint is the biometric, but the photo isn't — even though you can create a faceprint from a photo.
So when we start trying to define certain things as biometrics and other things not, it becomes tricky. And the other thing that becomes tricky is that a lot of these biometrics are not standardized. These are proprietary data representations, based on a specific vendor's algorithm. It's not like you can take a faceprint from one vendor's system, stick it into another's, and have any kind of compatibility — it's just garbage data at that point.
So there are some really interesting questions that come out here. And there’s some — obviously, there are legitimate privacy concerns as well, because of the fact that biometrics can help uniquely identify people. So there’s questions about whether biometrics can be used for surveillance or other types of problematic uses.
Where I'm going with that, though, is that I think businesses are rightly concerned about their exposure to these different laws that states are passing. So, for example, Illinois passed a biometrics law a while back. And Google wouldn't release its Arts & Culture app — which let people take a selfie and see whether their face closely matched any famous portrait or sculpture in its gallery — in Illinois, because of the state's biometric law. We saw other companies as well basically block their products from that market, because they were concerned about the potential risk exposure of being there.
And so I think that's a concern going forward. It's definitely a concern if California moves forward with some of its proposed laws around children's privacy. And one of the reasons is that we've already seen companies have to settle very expensive lawsuits over their exposure from violating these laws. What was particularly problematic in Illinois is that plaintiffs didn't have to show harm to establish a violation. And once the court ruled that you didn't have to show any kind of harm to go after these companies, we saw a flood of lawsuits.
So the number of BIPA lawsuits in 2019 was around 300, and in 2020, it doubled. Last year, we saw ADP, for example, settle for $25 million. Walmart settled for $10 million. Six Flags settled for $36 million. TikTok settled for $92 million. And Facebook settled for $650 million. All last year. All under just the Illinois biometrics law.
And so when you start talking about 49 other states also having similar laws — maybe on children's privacy, maybe on biometrics — I think that becomes really problematic for these companies. Again, these weren't lawsuits brought because there was a major data breach and lots of information was exposed, or because consumers were able to show significant harm. These were technical violations. At Walmart, for example, they asked their employees to use a fingerprint reader, as an option instead of a PIN, for checking out cash for a cash register. And, because they offered that biometric option, they faced a significant settlement.
And in California, we've also seen over 190 lawsuits in 2020, all related to its private right of action. So my concern, as we see states start looking at some of these narrower areas, is that they might enact policies that have a very significant impact in terms of access to markets, in terms of potential costs imposed on consumers, or in terms of products being taken away.
One good story, I think, that’s out there is in Virginia, which passed a ban on facial recognition last year. But, then, just recently, in the past few days, they rolled back that ban. And, instead, they put out kind of rules of the road. So they said that police and law enforcement can use facial recognition, but only under specific conditions. And they outlined that in the bill. And I think that’s the right type of approach. They’re really trying to stake out a middle ground.
So I’m hopeful that more states — if they decide to start going down that route of legislating around biometrics, or legislating around children’s privacy — aren’t really going to go down this route of putting up private rights of action or expensive provisions, but are really about finding a narrow balance, or waiting and working with federal legislators on crafting something that would be national rules.
Jennifer Huddleston: You know, all three of us have been involved in this debate around data privacy for a number of years now, and have watched it evolve and change, particularly around questions about states and data privacy, but also about the landscape of data privacy more generally. One thing I will say that has emerged over the past five or so years — starting with the GDPR, then some of the concerns in Congress following the Cambridge Analytica incident, and then CCPA, followed by CPRA, followed by all these other states — is that there does seem to be general agreement among advocates, experts, consumers, and innovators that a federal data privacy standard in the US would be preferable to a state-by-state approach.
I’m curious. I know that it’s always tough to make a prediction. But what do you both think about the possibility of a federal data privacy standard? And how do you think the growing patchwork of state laws, the increasing number of state laws that we see, impacts the debate over a federal data privacy standard?
Keir Lamont: I think that is a really great question. And an initial point that I would first like to raise is that, while I expect that you’re correct that most consumer advocate groups would welcome a robust federal privacy law, many of these organizations would not want a federal law to broadly preempt or limit the ability of states to enact additional privacy rights and protections on their own. So this is an issue of preemption. And it is one of the factors that has made efforts to advance federal privacy legislation particularly difficult.
Now, in my view, a federal privacy law is, ultimately, absolutely necessary. As we all know, the internet does not stop at state borders. And it makes no sense for my friends in Richmond, Virginia to enjoy greater privacy rights than my friends in Salt Lake City, Utah, or for those friends in Salt Lake City, Utah to enjoy greater privacy rights than me sitting in my apartment here in D.C.
Establishing strong federal privacy legislation would also have additional benefits of promoting individuals’ trust in the digital economy, and also helping to remove barriers to America’s digital trade, making U.S. industry and innovation more competitive on the world stage.
Prognosticating a little bit, though, I think that, while federal lawmakers have introduced over a dozen comprehensive privacy bills in recent years, and have also held numerous hearings on the subject, I don’t expect that a new law is imminent, especially given that we are in an election year. Many stakeholders have pointed to enforcement issues, some of which Daniel raised, around questions of how to scope a private right of action, or whether to include one at all.
There have also been sticking points around the issue of preemption, as I have mentioned. However, on one level, these issues are well-understood, and, at the end of the day, can be resolved through a political compromise. I worry that our tendency in this community to focus on these specific enforcement issues has led us to underrate the significant complexity of coming to agreement on the specific statutory language establishing substantive privacy rights and protections.
There has also been an increasing desire for comprehensive privacy legislation to advance civil rights protections, and to regulate against discriminatory uses of data. And I expect that more work needs to be done on those issues.
I understand that NTIA, the National Telecommunications and Information Administration, is expected to publish a request for comment on this topic soon. But, going back to another point you’ve raised, and how the evolving state privacy landscape may influence the federal debate, first off, I am interested in how the adoption of privacy legislation in a place like Utah or another so-called red state such as Iowa may trickle up to Republican policy-makers at the federal level, and may impact the contours of the national debate.
Second, I’m interested if the adoption of privacy laws like the Utah Consumer Privacy Act, that tend not to introduce additional new or unfamiliar compliance obligations for businesses that are already regulated under another one of these laws, may lessen some of the concerns that Daniel articulated about a so-called patchwork of conflicting state legislation.
It’s possible that that would reduce the urgency that some policy-makers and stakeholders feel about the need to advance comprehensive federal privacy legislation, which would be unfortunate. Alternatively, I would say that if a state adopts comprehensive privacy legislation that would include either a broad private right of action or is based on a significantly divergent regulatory framework, I would expect that could turn the heat back up in negotiations and conversations about federal privacy legislation. But that’s something we’ll have to wait and see how the state landscape continues to unfold.
Daniel Castro: Yeah. I’ll just build on that. I think you raise a lot of great points. I would say that there’s a few things we should keep in mind. So, one, if a privacy law starts to get more momentum in Congress, of course people want to start piling on new issues. So it’s no longer just about establishing some basic consumer privacy rights. It’s now about, well, should we also be regulating algorithms and A.I.? Should we expand to do more on social media? Should this now focus on reforming the FTC or establishing a new digital regulator at the national level?
So one of the challenges of getting any law passed is that as soon as you get momentum, there’s a lot of force being applied in the opposite direction, or weight being added on, because it’s a moving vehicle and people want to put their own ideas into it, which is fine, except we’ve had a few years, and I think we’ve mostly gone to consensus around a kind of common-sense federal privacy law. But, unfortunately, there are still sticking points around two key issues.
One is preemption. I think most businesses would think, “What’s the point of a federal law if you’re not going to do preemption?” That’s not going to help make things simpler for businesses to comply. It’s not going to simplify rules of the road for consumers so that you can communicate clearly about these are their rights, regardless of where they live. So there’s a lot of concern about that.
I think, in terms of competitiveness, we also have to keep that in mind at the global level. Europe still looks at the United States and says we don’t have a privacy law. And it doesn’t matter if California passes its own law. It doesn’t matter if all 50 states pass their own state-level laws. They’ll still say the United States doesn’t have a federal law and there’s not a federal regulator, and will continue to use that as a way to penalize the United States and limit transatlantic data flows. I think getting federal data privacy legislation is important for showing, internationally, that the U.S. is taking this issue seriously.
And, just in terms of kind of the politics of this, I guess the last point I would just make is that, at the very beginning, if we have a few states that have passed laws, I think that’s probably when it’s easiest for Congress to do something. I think this is when Congress has the biggest opportunity. As more and more states go down this road, it’s going to be much harder, particularly when you have ten, fifteen, twenty states. Because then you have a lot of stakeholders who are saying, “Well, why are you telling us we can’t do the privacy law that we just passed?”
But, of course, the more you get, the more essential it becomes, because then you have this really complicated patchwork of laws. So I’d say the second-worst outcome is passing a federal law after all the states have already done it, because that’s when you’ve already sunk all these costs into complying with all these different laws. And then you finally overturn that and you have one federal standard.
But, even with that, I would say the worst possible outcome would still be not having a federal privacy law, because those costs over time are just going to continue to mount and become more complex. And, again, we’re going to kind of lose on the international level.
Jennifer Huddleston: So now I’m going to turn to some of the questions that have come in from our audience over the course of this discussion, starting with, of course, another big element in the debate about data privacy legislation, at both a state and a federal level: the private right of action. Could you speak a bit about the variety of private rights of action that we’ve seen in different bills? Is there a state that either of you feels gets it right for consumers? And what impact do these private rights of action have on businesses and consumers alike?
Keir Lamont: So I can lead off by saying that although there’s been significant debate over whether or not to include a private right of action in many states that have considered comprehensive privacy legislation, to date, no state law that has included a broad private right of action has actually made it over the finish line.
California, as Daniel mentioned, does contain something of a narrow private right of action. But that’s limited to cases of data breaches, not violations of the substantive privacy attributes of the law. We are seeing states continue to iterate a little bit on their ideas of private rights of action, especially in Florida this past year. The private right of action in that bill was cabined to certain particular violations, such as inappropriately selling data, or violations of the children’s privacy protections within that legislation.
We’re also seeing increasingly nuanced conversations about what a private right of action — the ability to sue in your own capacity to enforce the law — should actually look like, or what should you be able to recover. Florida tried to limit the availability of big statutory damages just if you were suing a large corporation for violation of the law.
In other states, we’ve seen ideas floated around that say no statutory damages, however, you can recover injunctive relief. In Washington State, I believe there was a bill considered that provided a mechanism where a regulator would first determine whether or not there was a violation that had occurred that could, on a secondary basis, then enable private rights of action.
So, to date, it’s still an issue. States are still debating it. And we have to see what will develop.
Daniel Castro: I’ll just add on. Since we’re talking about the private right of action, we should, of course, go back to what the purpose is. The purpose is to ensure effective enforcement of the law. And I don’t think it’s always necessary, because, ideally, at the federal level, the law is being enforced through the FTC, or potentially through state attorneys general, who can still bring their own actions under the same law.
But if you do have a private right of action, I think it should be limited to only when there’s harm. Or it should be limited to when you’re seeking injunctive relief. One of my concerns is that some of these, many of these laws often have vague provisions where you have the law that passed, and then you have the rulemaking that proceeds from that. And a lot of these companies and businesses that are operating in these states are really uncertain about how the rules will be written and how they will be enforced.
And so when you have a private right of action and you have ambiguity, that’s, to me, a recipe for problems. It creates a high-risk regulatory environment that disincentivizes companies from collecting and using data, even in ways that are benefiting consumers. And what’s also important to remember is that even the laws that are more narrow could, of course, be amended in the future to add more provisions.
So, for example, we’ve seen a lot of discussion around things like dark patterns, which is a way of saying that there’s something in the design of a software interface that people find potentially problematic. Again, a very vague concept, but one that a lot of regulators are looking at, but one in which it’s so open-ended that it is very easy to find fault with a particular company’s product design.
And, so, again, if you open this up to private rights of action, you are really opening the door to many different expensive lawsuits. And I think that’s something that, again, many businesses are particularly concerned about.
Jennifer Huddleston: Thanks. Our next question is if you two could discuss a bit about the extraterritorial elements and problems that come with these state data privacy laws, and what approaches, if any, you think exist to address this problem, particularly as we have a growing number of states considering these laws. And, Daniel, we’ll start with you this time.
Daniel Castro: Sure. I think it’s a really interesting question. Certainly, we’ve seen this problem crop up before. You have apps on the various app stores where the FTC sends letters to the app developers saying, “You’re not handling children’s data correctly.” And they mail them to China, and they never hear back. And the apps might remain up there. And, then, at the same time, U.S. companies are, of course, held to that standard, because they’re here.
So this has always been a problem. I think — we were talking about states. There’s also the extraterritorial enforcement, in terms of enforcing this on out-of-state but domestic companies, which is a challenge. And it’s a challenge for many reasons. I guess the root of some of this question, though, is: can, for example, a foreign company acquire a domestic company and transfer their data?
And, there, I think there is more — there’s more consideration of some of these laws, either on setting requirements for data transfers or contractual requirements, or just disclosures about what companies can and can’t do with data that, ideally, will include provisions about how they’d handle acquisitions. But that’s not always spelled out clearly. And, again, it can vary from state to state.
Keir Lamont: Yeah. It’s a really great question. And I would just add, an additional complexity in U.S. state extraterritorial enforcement is that, for many businesses, especially operating in an online context, to figure out where one of your customers is coming from and what specific privacy rights and protections may apply to their data, you may actually have to create, collect, and associate more information with that customer to figure out where they’re visiting your website from.
So that’s kind of another issue in this space. And the solution to many of these issues will be, ultimately, enacting federal privacy legislation. So I hope we get there.
Jennifer Huddleston: Well, I think we have time for one last question. And I’m going to kind of modify a bit of a question that was asked by the audience. I’m curious: what impact, in terms of business decisions, do either of you think the different size thresholds that we’ve seen in many of these state-level data privacy laws have?
Is there an incentive for companies that may be nearing that threshold to either split themselves up, or try to keep their user base low, to avoid it? Or even, perhaps, to sell to a larger company that may already have existing compliance measures in place? So if you could speak a little bit to the impact that you see from these different thresholds regarding who is covered by the user or revenue requirement.
Daniel Castro: Maybe I can start off. I’ll say just one of the — I think these, the laws that try and put these caps on only applying the law to a certain size company, are hugely problematic. Because, think about this from the consumer perspective. A consumer cares about their data, just full stop, right? They don’t care that there is a data breach from an app that had 100,000 users, versus one that had 90,000 users, and that only the one with 100,000 users has to follow this law and the one with 90,000 users doesn’t.
They don’t care. If they hear that there’s a privacy law passed, they expect these provisions to apply across the board. And, of course, a lot of times, lawmakers are putting these caps in place because they’re trying to say, “We’re not trying to harm small businesses.” But that just ends up not being effective for consumers. And then it gives lawmakers more leeway to, perhaps, pass provisions that are expensive and inefficient, saying, “Well, it’s okay, because it only applies to large businesses.” But there’s no reason not to have an efficient law that applies to businesses of all sizes.
So I think the starting place is we should be a little skeptical of some of these size caps. That said, some of these size caps certainly will, I think, make any company that is on the cusp look closely about what they can do to stay under it, because going over will expose them to new risk. Going over will expose them to new obligations that maybe there’s really no reason for them to want to pursue.
That said, there’s going to be a lot of companies that are, of course, significantly over it. Because, oftentimes, you can easily have a contact list of a certain size or have a certain amount of revenue. And that doesn’t mean you’re a big company. It just means that you are a company operating in a digital environment where you do collect a lot of data from customers and potential customers. And, because of that, you’re hitting these thresholds.
Keir Lamont: I agree with a lot of what Daniel said there. Privacy harms, the risks from using data, emerge from how that data is used — wrong or unfair uses of that personal information — not from how many different individuals’ or consumers’ data a particular company or processor collects and holds.
At the same time, it does make sense why lawmakers would seek to lessen the regulatory burdens and onus on, particularly, small industry. So I understand where they’re coming from in attempting to set some of these small business carveouts. Though, as Daniel said, there are problematic results from that.
And I suppose I would close by pointing to a point that I actually expected Daniel to raise, which is that, of the four privacy laws here — California, Virginia, Colorado, and Utah — all four of those states will have different thresholds and cutoffs for whether or not a business collecting a particular amount of data on consumers of that state, or doing a set amount of business in that state, will ultimately be covered under that state’s privacy law. And that is a question that companies are going to have to figure out with some degree of attention as they work through their compliance obligations.
Jennifer Huddleston: Well, thank you guys for a lively and informative discussion. I’m sure that this map continues to get more and more complicated. Before I turn it over to Colton for some closing remarks, Keir and Daniel, could you each tell our listeners, or people who may be listening to this when it goes out as a podcast, where they can find out more about you and your work? I’ll start with Keir.
Keir Lamont: Sure. Well, it’s a really fascinating time to be paying attention to this space. And states are moving incredibly quickly. So track me down online. I’m not hard to find. Follow the Future of Privacy Forum website. And we are doing our best to keep policymakers and the general public up-to-date and informed on what the states are doing when it comes to privacy legislation.
Jennifer Huddleston: And Daniel?
Daniel Castro: Yeah. Thanks so much, Jennifer. It was a pleasure to be a part of this conversation. You can find my work with the Information Technology and Innovation Foundation at ITIF.org, or the Center for Data Innovation at datainnovation.org. I’m always happy to chat with people online on Twitter, as well, @castrotech.
Jennifer Huddleston: Great. And I am online. Our website is netchoice.org, and on Twitter, @jrhuddles. And, with that, I will turn it over to Colton and The Regulatory Transparency Project for some comments to close us out.
Colton Graub: Thank you Daniel, Keir, and Jennifer. We are very grateful to you all for your time today and for the insightful discussion on this important and very timely topic. To our audience, for those of you who joined the conversation midway through, you can listen to the recording when it is released via our podcast feed in the coming days. We welcome listener feedback by email at [email protected] Thank you all for joining us. This concludes today’s call.
Conclusion: On behalf of The Federalist Society’s Regulatory Transparency Project, thanks for tuning in to the Fourth Branch podcast. To catch every new episode when it’s released, you can subscribe on Apple Podcasts, Google Play, and Spreaker. For the latest from RTP, please visit our website at www.regproject.org.
This has been a FedSoc audio production.
Daniel Castro
Vice President and Director, Center for Data Innovation
Information Technology and Innovation Foundation

Keir Lamont
Technology Policy Research Fellow
Future of Privacy Forum
The Federalist Society and Regulatory Transparency Project take no position on particular legal or public policy matters. All expressions of opinion are those of the speaker(s). To join the debate, please email us at [email protected].