Explainer Episode 50 – FTC on Privacy: The Statutory Authority Behind the Plan
Last November, the Federal Trade Commission accepted comments on its proposal to start a rulemaking related to “Commercial Surveillance” – the agency’s newly minted term for any and all business use of data about customers. The FTC has for decades sought to protect consumer privacy and data security through case-by-case application of its general consumer protection authority. It also is charged with rulemaking in a few narrow areas of privacy and data security. In practice, the FTC has become the U.S.’s primary privacy and data security enforcer. Now, while Congress deliberates on whether and how to adopt a general privacy law, the FTC seeks to fill a perceived gap with agency rules. Does the agency have the authority to do so? What can we learn from the proceeding thus far? What are the agency’s likely next steps and will it succeed? Our participants will discuss the proceeding, their participation in it, and what comes next.
Although this transcript is largely accurate, in some cases it could be incomplete or inaccurate due to inaudible passages or transcription errors.
[Music and Narration]
Introduction: Welcome to the Regulatory Transparency Project’s Fourth Branch podcast series. All expressions of opinion are those of the speaker.
Chayila Kleist: Hello. Welcome to the Regulatory Transparency Project’s podcast. My name is Chayila Kleist, and I’m Assistant Director at the Regulatory Transparency Project here at The Federalist Society. Today, we are delighted to host a discussion on the Federal Trade Commission’s recent actions implicating privacy and the statutory authority behind the plan.
To address our topic today, we have with us two members of RTP’s Cyber and Privacy Working Group. The first of these, Neil Chilson, who is a Senior Research Fellow for Technology and Innovation at Stand Together Trust, where he spearheads the Trust’s effort to foster an environment that encourages innovation and the individual and societal progress that makes it possible.
Prior to joining STT, Mr. Chilson worked at the Federal Trade Commission in two different capacities—most recently as the FTC’s Chief Technologist and before that as an advisor to then-Acting FTC Chairman Maureen K. Ohlhausen. In both roles, he advised Chairman Ohlhausen and worked with the Commission staff on nearly every major technology-related case, report, workshop, and proceeding.
Also joining us today as our guest host for this podcast is Ashley Baker, who’s the Director of Public Policy at the Committee for Justice. Her focus areas include the Supreme Court, technology, regulatory policy, and judicial nominations. Her writings have appeared in Fox News, USA Today, The Boston Globe, The Hill, Real Clear Politics, and elsewhere. Much of Ms. Baker’s work is at the intersection of courts, regulation, and technology, and thus she engages in policy analysis and outreach on legislation and regulations related to these issues, writing op-eds, letters to Congress for committee hearings, and regulatory comments.
Now, in the interest of time, I’ve kept my introductions of our guests brief, but if you’d like to know more about either of our guests today, please feel free to visit www.regproject.org and read their impressive full bios.
With that, however, I’ll hand it over to our host. Ms. Baker, the mic is yours.
Ashley Baker: I’m sorry. Thank you. And thank you for having us here today. So by way of background, last November, the Federal Trade Commission solicited comments on its Advance Notice of Proposed Rulemaking, which would initiate a rulemaking on privacy, or what it terms commercial surveillance—the agency’s newly minted term for any and all business use of data about customers. The FTC has, for decades, sought to protect consumer privacy and data security through case-by-case application of its general consumer protection authority. It is also charged with rulemaking in a few narrow areas of privacy and data security, but, in practice, the FTC has become the U.S.’s major privacy and data security enforcer, especially since we do not have a federal data privacy law.
So this comes at the same time as Congress is deliberating whether or not to pass a law, and we’re here today to discuss a lot of things related to this, including what’s next in the rulemaking process, what exactly the notice entailed, whether or not the Federal Trade Commission has the authority to do some of the things that it seems to be contemplating in its notice.
And I have here with us, Neil Chilson, who submitted comments on the ANPRM, and my first comment for you — question for you, Neil, is since the FTC has long been the primary enforcer of privacy, what is new that the FTC is trying to accomplish here? What is different with this versus its approach to privacy and data security on a case-by-case basis, which it’s done pretty effectively, for the most part, for many years?
Neil Chilson: Well, it’s good to be here, and thanks, Ashley, for that question. And I know that you also submitted comments on this, so I hope we’ll get a chance to talk about your comments as well. So as you mentioned, the Federal Trade Commission’s approach to privacy and data security in the past has been one that is focused on its statutory authority to protect consumers from unfair and deceptive acts or practices. And the typical way that it has done that is by holding companies to the promises that they make to consumers around their privacy policies and the use of consumers’ data.
What the FTC, I think, is trying to do here is — would, in some ways, be a substitute for what Congress might be trying to do, which is — has also tried to do, which is to set some rules about specific practices that the companies have to follow. As I mentioned, right now, if a company — primarily, if a company explains what it’s doing to you with your data in a way that’s accurate and it isn’t doing something that causes harm that can’t be avoided by the consumer and that isn’t outweighed by benefits to consumers or the competition—that’s the unfairness test—then there aren’t really set rules for what the companies can do.
The FTC has put out guidelines that show best practices for what they think. And these guidelines have been in place since 2009, at least—2009/2012. There have been various iterations of that in reports that the Commission has put out. And so, those have been guidelines to business, but ultimately, it’s been whether or not companies lied to consumers about what they were doing with data. So this rulemaking, I think, is the first step by the agency to try to make substantive rules.
Unfortunately, the ANPR is so broad that it’s — it’s not really clear at all what types of rules they might want to adopt. The range here could be everything from not doing anything to—if you read some of the comments—literally banning all collection of data or banning targeted advertising in the industry in its entirety. And so, there’s quite a wide range. The ANPR asks, I think, 95 different questions, and they’re all framed very much with a serious skepticism about any benefits to consumers that might come from what—as you point out—the FTC terms “commercial surveillance.”
I could keep talking about the scope of the rule and the scope of the ANPR—it’s very broad—but I wonder if you had any thoughts that you might — wanted to reflect on that?
Ashley Baker: Sure. I guess to start where you left off on its being 95 questions and not very clear what they want to do here and it’s — that seems to be kind of a trend, too, amongst the Federal Trade Commission recently — is to do something like propose 95 questions and half of them are like, “Please give us examples why we are correct in what we are suggesting we want to do right here.” And they’re all very leading questions. But still, at the end of it, it doesn’t give you any clear idea of what they actually want to do because the rule would have to be a lot narrower than the many, many things they are fishing for here. It’s also hard, too, for those of us who file comments of — “How can you be helpful here? Where do I start?” You can’t answer all 95 of them, nor would it be helpful for you to do so. There are some process concerns there.
Let’s, I guess, back up a bit to why the FTC should or shouldn’t do this as a rulemaking versus Congress. I mean, there are downsides, too. Right? So even if the FTC had the rulemaking authority, its rules wouldn’t necessarily have federal preemption of state laws, and also — the Commission’s also pretty limited in its authority to implement remedies by regulation. What are your thoughts on that?
Neil Chilson: Yeah. So I think that’s a great point. I mean, there is the basic separation of powers constitutional issue that, typically, we want Congress to write our laws, not unelected bureaucrats. And so, the fact that Congress is considering writing laws here suggests that there is a big, at least, perceived gap in the legal system. And for the agency to step into that gap raises a bunch of constitutional questions, not least of all the major questions doctrine. Did Congress intend for this agency to write rules of this type if Congress simultaneously is contemplating writing rules of this type? So I do think that there are some interesting statutory authority problems with this approach. But stepping back from that, the broadness of the ANPR — I think you point out a really great point. There is a weakness here that, when you have such a broad ANPR under administrative law, you may not be providing sufficient notice to people to provide comments.
Now, this is the start of a long process, so I think the FTC will, if it continues to proceed with this rulemaking, have additional opportunities for comment. But the current ANPR is, I think — I argue in my comments—which I did with Jim Harper from AEI—that the current one is framed up in a way that it is very unlikely to provide the type of robust record that the agency would need to survive a court challenge. And I think nothing typifies that more than the use of the term commercial surveillance, which is right in the title of the rulemaking. It is not only completely unprecedented in the FTC’s 20 years of privacy and data security practices, it’s pejorative. Right? Surveillance very much connotes a power dynamic between the party being surveilled and the surveiller, and the ANPR makes no bones about the fact that it thinks that there is this really disadvantaged power dynamic between consumers and users of internet websites and the providers of those services.
Problematically, the definition that the Commission uses for commercial surveillance is — as you pointed out, it’s so broad it’s not even limited to online companies. It is basically anytime a company gathers any information about a consumer. That could include—and it does include, under the various — under the definition of the term in the ANPR—giving the shipping address to a company so that they could send you information. This is something that is so common and not only commonplace but necessary to commerce that it’s not clear to me why the Commission would use such a pejorative term for it.
And I think that type of extreme vagueness means that it’s very difficult, as you pointed out, to engage productively with the proposals that the Commission has—to the extent they have them—because if they’re defining, in the scope of commercial surveillance, practices that everybody agrees are completely necessary to commerce, then the scope of the potential rule could be extremely broad, too, reaching pretty much any interaction that a consumer has with a business.
So I don’t know. We focused very much on that high-level framing for the exact reasons that you pointed out; it’s very hard to know how to engage with this rulemaking. And our ultimate recommendation was that the Commission go back to the drawing board—that is, think really hard about what privacy is, look back at its past practices, and build from that to issue a new ANPR that actually offers some substance to which commenters could offer thoughtful responses.
Ashley Baker: And you’re right about [inaudible 12:23] commercial surveillance. It’s a term that’s particularly gloomy. The Commission’s been — they’ve been doing a lot of that lately—you have dark patterns—and then there’s my personal favorite, which is exclusionary or exclusive contract, which is the whole point of a contract, as if that’s implying exactly the same thing—some sort of imbalance of power there. I think it is important, though, to look at how the public — how you would give substantive comments when it’s all over the place. And also, is the agency just fishing for — if you put out 95 questions, there’s someone who’s going to submit something that is exactly what you would like to do. And then, if you narrow the rule down into that direction, then it, technically, is a logical outgrowth of that process. If you ask enough people—and that includes those you agree with—you can cherry pick more easily.
And going back to the drawing board, though — first, going back to, I guess, Congress and the FTC dynamic, I think one thing that’s being discussed a lot is, “Does this deter Congress from passing a bill?” And I think, on one hand, just the fact that they’re doing it does, but also privacy bills are uniquely hard to write. And I learned this when I was on the Uniform Law Commission’s drafting committee—their personal data protection law that’s now out for state adoption—and it is challenging. I mean, the first thing I feel like you have to start with is carefully defined covered entities, and that part’s not as easy as it sounds at all.
So what does this mean for what can be going on in Congress? What are the next steps for the FTC in this?
Neil Chilson: So it’s called an Advance Notice — I’ll tackle that last part. It’s called an Advance Notice of Proposed Rulemaking because it’s supposed to be followed next by a Notice of Proposed Rulemaking. I think, in some ways, the NPRM has to be narrower. It would be hard to imagine that it could be broader than this current one—the first one, which is so broad. I think the Commission, to your point, will have a broad record, although not a complete record. My point, earlier, was that the challenge here is that the questioning is biased enough that it’s going to get a certain set of answers, either in reaction or comment to it, but it won’t — it will not get some basic information that it needs to have.
One of the key questions that’s not even in here — in fact, the word is not even used that much is, “What is privacy?” The Commission does not ask that. It does not even use the term privacy very often in the ANPR, which is very strange, I think, and I’m not quite sure why that is. I think, in some ways, it suggests that the agency is trying to break with its past approach to privacy enforcement. One of the clearest ways is that the Commission does not have — it does not cite — the ANPR does not even cite the sort of touchstone of privacy guidelines that the FTC had issued back in 2011/2012. This is the privacy report that the Commission issued after several years of open debate and workshops around the topic. That report is a very considered approach, both of what are the concerns that the Commission is trying to address, but also, what is its authority to do so, and how does it tee up?
It seems like the Commission, here, is trying to step away from that by not even citing it, as well as not really mentioning privacy. That raises another big legal problem because the Commission only has the authority—and I think, actually, Commissioner Rebecca Slaughter pointed this out. The Commission only has the authority to adopt rules for cases in which there is an established set of violations. And then the Commission can then say, “We’ve brought a bunch of cases in this space. Now we’re going to prohibit this specific practice.”
If the Commission is trying, here, to leave its long historical approach of enforcing privacy and do something different, well, then that weakens the legal case for adopting rules. And so, I do think that that is another reason that the Commission should go back and restart this. Now, if they don’t, however, there will be a bunch of procedural hoops that the Commission has to jump through. First will be an NPRM. There will be some hearings where people can sign up to testify and offer public comment. And then, of course, the Commission will have to write a draft rule. In fact, I think, probably, they should do that in the NPRM, but I’m not sure what the actual legal requirements are around that. They certainly will have to write a draft rule and put it out for public comment before it goes into effect.
And so, I think we will see, to the extent that the Commission continues to move forward with this — it will be a long, drawn-out process with several other opportunities for public comment.
Ashley Baker: And speaking of this being a grab bag of a lot of pretty leading questions, is this something we are seeing at the Federal Trade Commission in other contexts?
Neil Chilson: So the Commission has gone on an unprecedented—well, going back, at least, to the 70s—spree of rulemakings. Traditionally, the FTC, for the last 40 years or so, has very much been an enforcement agency, both on the consumer protection and the antitrust side. It does not write a lot of rules, although, as you noted at the top, Congress has occasionally specifically requested that the agency write rules in some areas and given it what is known as traditional APA rulemaking authority.
So rulemaking is not something that the Commission has a long track record of doing successfully, but all of a sudden, they have a bunch of rulemakings happening, in particular—and I think you may be even more plugged into this one than I am—a recent proposal out of the agency to ban all non-compete contracts on the competition side, which is, I guess, in some ways, more specific than this particular proceeding—than the privacy proceeding. But it is a very broad, one-size-fits-all rule to a practice that no doubt can be abused but, often, also is an essential way to protect investment in employee training and intellectual property. The FTC has been doing a lot of this, and it’s interesting to see them stretch their legs in the rulemaking space, which is not something that they have a lot of deep experience doing.
Ashley Baker: As an aside about the non-compete notice, I would say it is just as broad, just in a very different way. The privacy notice is very broad, in that they’re contemplating a lot of ideas, and you don’t have a clear direction of what they’re doing. The non-compete rule, though, applies whether you’re a low-wage hourly worker or a senior executive whose job involves trade secrets. And then it also voids past non-compete agreements, which I think is actually the really egregious part of it as well. And then, also, it’s something that has been left to the states for reasons that make a lot of sense.
So I think they do have kind of the same approach, though, of saying, “Here’s something we’re thinking about. Give us some examples of why it’s good.” And it’s hard to work from that—to comment in a substantive way. And getting back, though, to the FTC authority issue specifically, and the context here — another reason it makes less sense for this to be done the way that they seem to be proposing is that if the agency is operating solely under its [inaudible 21:15] authority, then it has very limited remedial authority, which means that it can, basically, just issue cease and desist orders and punish parties for subsequently—as you were discussing earlier—violating those orders. And that limited remedial authority kind of balances out the broad discretionary authority they have to make that initial determination of what could fall under unfair [inaudible 21:39] practice.
But that’s not a very straightforward way of enforcing. I mean, it’s not only unclear and confusing but also — I wouldn’t call them due process concerns in the strict legal sense, but it does bring up those sorts of concerns: not really knowing what is in that area and what you can and cannot do, and then they can kind of just sweep in with a cease and desist. Having clear rules written by Congress — I mean, that’s better for everyone all around. Right? Because then it also helps the consumer more, too, in that harm is remedied earlier.
Neil Chilson: Yeah. I think that’s a great point. I mean, as Jim and I point out in our comment, privacy itself — when people talk about privacy, they’re typically referring to at least eight different interests or values: fairness, personal security, financial security, peace and quiet, autonomy, integrity against commodification, reputation, and the ability to control others’ access to your information. And so, those are all very different interests, and to your point, that means it’s very hard to write a clear law around this, which, when Congress is doing it, it’s a practical problem. Right? Congress is trying to write a clear law, but the legal barriers are less. The due process concerns, when you cannot write a clear law or when you’re doing a poor job of it in the rulemaking context, are less controlled by the political interests and so — sorry, I should say, less balanced by the political interests.
There’s less accountability when a rulemaking agency or when an agency is writing an unclear rule than when Congress does. And so, I think there’s a heightened concern. It’s difficult to write a privacy law in Congress. It’s difficult in an agency as well, but at least Congress has a clear political check—but they do write poor ones.
One other thing that’s particularly interesting, I think, about the ANPR — I think you raised a really good point around limited remedies. One of the reasons that the Commission does want a rule—I should say, though—is that under its UDAP authority right now, it is limited in imposing fines for violations until after it’s brought an initial case and a company then violates the consent order that results. Now, it can recover damages for consumers in those initial instances, but it can’t impose penalties for violating the law.
And so, I think one of the reasons the Commission wants a rule in this space is that it would like to impose penalties. But when you impose penalties, that raises — courts have a much higher standard for what is the notice that you have to give people about potentially violating the law. And judging from this ANPR, with its extremely broad scope that, I guess, at least in theory, could make it a violation of the law to take your consumer’s address to provide them the product they ordered, it’s not clear that we’re going to get a clear rule that would meet that type of proper notice requirement that’s required by the law.
Ashley Baker: And courts do have to factor in harm to consumers. And that’s what would become problematic here. I would say a lot of these things that they’re citing as examples span a range—from your Social Security number or highly personal information being leaked, to things that you just don’t feel good about but that don’t cause concrete injury. And I think that would cause a lot of problems down the road.
And more broadly, I feel like over the past ten years or so, among those in the FTC and Congress—and this is on both sides of the aisle—there’s been this shift in thinking about how to approach a privacy law, that it’s more about, “How do I feel about various uses of my data?”—and that can be a very wide spectrum of what people are okay with—versus — it used to be more about what the company or the covered entity — what, specifically, they did with that data. Look at the Privacy Act that the government has to comply with, which I think does a pretty good job setting forth those definitions, in terms of what creates additional data maintenance categories.
And there is a difference between what you’re able to do with data—whether you just process it, store it, or actually maintain a data set—whereas privacy preferences are so incredibly subjective, and they don’t take into account tradeoffs. And that’s the one thing that I point to in my comments: there’s a lot of public polling done about, “In this scenario, do you feel that your data could be shared?” But then, when a tradeoff is presented—is it fair for them to have this information if, in exchange, you get free access to email services? That’s just a random example; I don’t remember the survey questions. People are almost always willing to give up those subjective privacy preferences, which vary very much from individual to individual.
Neil Chilson: Yeah. Yeah. It’s my thesis around online privacy that we have bad mental models for what happens when we browse the internet. I think most people think that when they’re sitting on their couch browsing the internet that it’s somehow happening inside their computer, and therefore, they think of it as somewhat private. But in reality, they’re tromping around on other people’s computers, usually using them for free, and the expectation that the people whose computers they are wouldn’t notice that you’re using them is somewhat surprising if you actually understand how the internet works.
So that’s not to say that all of that data should be gathered and used or that companies don’t have an interest in making sure consumers aren’t worried about how their data is used. Companies should try to build trust with consumers, and I think that’s largely what they try to do in being clear in their privacy policies. But that’s very different than the physical space, where we have much better intuitions for when our data is open to other people and when it isn’t.
But that’s a big-picture theoretical question. It’s the type of question, however, that it would have been nice for an ANPR to ask, “What is really going on out there, and how are people concerned about it?” And I don’t think the ANPR really gets into that very well. They dive in pretty quickly to some presumed harms or presumed types of harmful uses and then ask for a bunch of examples to help shore up a case for whatever rule they want to do in the future.
Ashley Baker: It’s an interesting point that our mental model about privacy and surveillance — it’s reflected in the term itself of commercial surveillance, as if it’s implying there’s someone from Google standing behind me looking over my shoulder as I’m doing my work all day, and that’s very much not how that works at all.
Neil Chilson: Yeah. I mean, one of the things that we point out is that the use of the term surveillance here, in some ways, dilutes its effectiveness for talking about areas in which there are very clear power dynamics. So when we’re talking about law enforcement or national security, which have strong interests, obviously, in understanding the world — but those are situations in which surveillance occurs. That’s the traditional usage of the term, and in those cases, the power dynamic—the power differential—is enormous, and it could — when it goes wrong, could result in death. Right?
And so, by using the term “surveillance” to talk about things like collecting your address to ship you something, they’re really diluting the term in a way that I think is harmful to what I think they would — I think most people would agree is a good goal of making sure that we have proper limits on surveillance of the kind that law enforcement or national security can do. So I do worry about that linguistic effect of the agency’s word choice here.
Ashley Baker: Especially when it’s surveillance that’s authorized, which most actual intelligence surveillance is. But to use the term surveillance — in order to conduct surveillance, you usually have to get a court order and demonstrate probable cause or something similar, depending on what set of circumstances we’re talking about. It takes quite a bit to get from A to B—not just owning a company that processes data. These are really very much apples and oranges.
Neil Chilson: Right. So you mentioned earlier the difficulty of responding to this ANPR, in part, because of its broad nature and the many, many questions in this biased version. The comment that I did with Jim—again, we did a very high-level look at the approach and critiqued that. But then I had another more fun comment that I did separately where I used GPT-3, which is the — this is before ChatGPT came out, but it was the — it’s the large language model that underlies ChatGPT. And I used it to generate comments to the FTC on this ANPR from the point of view of AI bots who are concerned about having data — having this rule cut off their access to data that they need to learn more. So it was pretty tongue-in-cheek—kind of just a fun experiment. But I do think there is a real concern.
I think people do not think very hard about the tradeoffs between limiting data collection at a very governmental level—at a high level—and our ability to learn new things about the world around us. I mean, typically, understanding the world around us, including understanding other people better, is a great creator of value, and it — obviously, it can be misused. No doubt it can be misused, like all powerful tools, but we — I think people don’t think hard enough about what it exactly means for us to become more willfully ignorant about people around us under a set of rules that are set by Congress or perhaps by the FTC if they get their way here.
And while those effects might be — I don’t know what the real effects would be for AI bots, although ChatGPT was created largely from data collected from the internet, but I don’t know what the effects would be there. I do think that there would be strong effects if we had a much more constrained and limited ability to pay attention to who is using the website I run or who is shopping or searching for this item that I might be able to help them find. And so, I do think that there really can be a tension between privacy regulation and the ability to provide useful services to consumers.
I think the New York Times even had an article this week. It was on “Why are everybody’s ads so bad now?” And some of the ad ecosystem experts were pointing out some of the recent developments, both in companies and in the policy space, including Europe, that have made it harder to deliver the kinds of ads that people want to see because you don’t understand who they are. And so, you don’t know what kind of ads they do want to see.
So anyways, I think some of those effects are starting to become a little bit more clear, but the ANPR certainly does not ask any questions about what might be the downsides to limiting information in this space. Or if they do, they only cursorily offer those types of, “Hey. Let us know how this could go wrong.” I wish they had done some deeper thinking into eliciting good comments from people about what might be the tradeoffs to various approaches to privacy.
Ashley Baker: It does raise the concern that if you limit the data set in this way with machine learning, that does give rise to some of the problems that the same people cite in the artificial intelligence policy context.
Neil Chilson: Right. Well, I mean, one thing — this Commission and many other advocates in the privacy space are concerned about bias in AIs, and anything that shapes the data set will shape what data goes into these AIs. And so, it’s interesting to think about the potential tension there between privacy and having adequate, balanced data sets to train AIs.
Ashley Baker: Any final thoughts here about what the Commission might do next or what — if Congress will try to take charge of this? Any predictions there? And also, what should people who are listening, who might want to weigh in when there is a Notice of Proposed Rulemaking, what should they do there, and when might that happen?
Neil Chilson: So the agency has a ton on its plate. It seems like it keeps piling more on every day. So I don’t know what the timeline is on this proceeding. You could certainly follow me on Twitter if you want to know when there might be opportunities to comment or follow the FTC. They will also let you know but with less snark, probably.
So I don’t know what the timeline looks like here. This will be a long process, and given the Commission’s not only large docket of issues — but also the chair has a particular interest in the competition side of the FTC’s authority and is less interested, I think, in this particular side of the FTC’s authority, so it could be a while. I’m not making any bets—I wouldn’t bet on that—but I would think that it could be a while before the next step in this proceeding happens. But when it does, there will be lots of comments, and it will be interesting to see if they take Jim’s and my advice to scrap it and start over. I suspect they will not, but who knows. Maybe sense will prevail. That’s all I have.
Ashley Baker: Sorry. I did turn off all of my alerts. I did not realize my dog would see a squirrel out the window and go insane. So I’ve been pushing him out here, and Ryan’s not at home right now, so I don’t have my usual dog stopper.
Neil Chilson: No worries. No worries. We just…
Ashley Baker: I think that’s the right note to end it on now. I mean, I feel like there’s nothing to really add after that.
Neil Chilson: All right. Great. Cool.
Chayila Kleist: We’ll wrap it there. Thank you so much for joining us today, Ms. Baker and Mr. Chilson. I really appreciate your time and you sharing your expertise and insight.
Conclusion: On behalf of The Federalist Society’s Regulatory Transparency Project, thanks for tuning in to the Fourth Branch podcast. To catch every new episode when it’s released, you can subscribe on Apple Podcasts, Google Play, and Spreaker. For the latest from RTP, please visit our website at www.regproject.org.
This has been a FedSoc audio production.
The Federalist Society and Regulatory Transparency Project take no position on particular legal or public policy matters. All expressions of opinion are those of the speaker(s). To join the debate, please email us at [email protected].