Deep Dive Episode 131 – Free Speech in the Digital Era: Section 230 and the FCC
Section 230 of the Communications Decency Act provides liability protection to platforms, internet service providers, and other online intermediaries for third-party content they host or republish. It also provides liability protections for actions taken “in good faith” by such entities to moderate content. Section 230 has recently come under scrutiny from President Trump, members of Congress, and others who have raised questions about the appropriateness of these protections and their continued viability “in the Age of Twitter.”
In May, President Trump issued an Executive Order that directed the National Telecommunications and Information Administration (NTIA) to file a petition for rulemaking with the Federal Communications Commission (FCC) proposing regulations to clarify the scope of Section 230. The FCC is currently soliciting public comment on the NTIA petition, which was filed on July 27.
In this live podcast, panelists discuss the background of Section 230 and reflect on whether it continues to encourage innovation and free speech online, or if changes are needed. What should the FCC do to address the pending NTIA petition? And, in light of the upcoming elections, what are the political dynamics at play, at the FCC, in Congress, and in the White House?
Although this transcript is largely accurate, in some cases it could be incomplete or inaccurate due to inaudible passages or transcription errors.
Dean Reuter: Welcome to Teleforum, a podcast of The Federalist Society’s Practice Groups. I’m Dean Reuter, Vice President, General Counsel, and Director of Practice Groups at The Federalist Society. For exclusive access to live recordings of practice group teleforum calls, become a Federalist Society member today at www.fedsoc.org.
Nick Marr: Welcome to The Federalist Society’s teleforum conference call. This afternoon, August 24, 2020, we’ll be discussing with a panel of distinguished experts “Free Speech in the Digital Era: Section 230 and the Federal Communications Commission.” My name is Nick Marr. I am Assistant Director of Practice Groups at The Federalist Society.
As always, please note that all expressions of opinion on today’s call are those of the experts.
We’re fortunate to have with us today as moderator Jamie Susskind, the Vice President of Policy and Regulatory Affairs at the Consumer Technology Association and former Chief of Staff to FCC Commissioner Carr. Jamie, before I hand it over to you for the rest of the introductions, I’d like to let our audience know that we’re planning to have about 10 minutes after Acting Commissioner Candeub’s opening remarks for audience questions, so have those in mind for when we get to that portion of the call. Okay, Jamie, the floor is yours.
Jamie Susskind: Thank you, Nick. Good afternoon, everyone, and thank you for joining us today. I’m pleased to welcome our distinguished speakers who will share their views on Section 230 and the various actions happening in the Executive Branch and Congress in this space.
So first let me introduce Adam Candeub, Acting Assistant Secretary of Commerce for Communications and Information at NTIA. Assistant Secretary Candeub will be making his brief remarks, and then, as Nick said, he’ll take questions from participants on this call. So Acting Assistant Secretary Candeub joined NTIA in April 2020 as Deputy Assistant Secretary. Prior to his NTIA appointment, he was Professor of Law at Michigan State University College of Law, joining the faculty in 2004. And he also served as Director of its Intellectual Property, Information, and Communications Law Program.
His earlier government service includes work at the Federal Communications Commission in its Media Bureau and Wireline Competition Bureau — my alma mater also — the Competitive Pricing Division. He was an associate at Jones, Day, Reavis & Pogue. He holds a J.D. magna cum laude from Penn Law and a B.A. magna cum laude from Yale University.
Immediately following law school, he clerked for Chief Judge J. Clifford Wallace on the U.S. Court of Appeals for the Ninth Circuit. While in law school, Professor Candeub was an articles editor for the University of Pennsylvania Law Review. He received a Fulbright Fellowship in 2010-2011 to teach and research internet law at the University of Rijeka, Croatia. He’s widely published in the areas of communications regulation and antitrust.
So I’ll now hand it over to Mr. Acting Assistant Secretary Candeub.
Hon. Adam Candeub: Well, thank you so much, Ms. Susskind. And thank you, Nick Marr, and The Federalist Society for organizing this event. It’s always a pleasure to talk about one of my favorite subjects, Section 230 of the Communications Decency Act. But for better or worse, Section 230, like most telecommunications statutes, reflects a long and complicated regulatory history.
So to start out, start this discussion, I think it might be a good idea just to review the origins of Section 230, origins that take us back to the internet of the 1990s. In the 1990s, the mid-‘90s, with the development of the world wide web protocol, the internet went mainstream. And as more and more Americans connected to the internet, there was a widespread recognition of how empowering this tool could be. Gatekeepers were pushed aside, and people were realizing the freedom of expression in new and exciting ways.
I think many of you on this call will remember the words of the poet John Perry Barlow who captured this giddy sense of freedom in a declaration of internet freedom, and described the medium as liberating people from, quote, “property, expression, identity, movement, and context” so as to allow them to, quote, “spread across the planet so that no one could arrest their thoughts,” end quote.
But from the beginning, there were challenges in keeping the forum free as well as friendly to families and children. This concern led to the passage of the Communications Decency Act, which was codified as part of the Telecommunications Act of 1996. Congress in the CDA — that’s the acronym for the Communications Decency Act — wished to create spaces on the internet that were suitable for children and families without those platforms becoming publishers of the content that they moderated, and thus assuming a huge amount of liability for all of the [inaudible 05:03].
At the same time, Congress wanted to encourage openness and allow for bulletin boards and other similar fora to flourish. This meant preserving the rule set forth in the famous Cubby decision: platforms that simply post, but do not content moderate, user-generated content do not face liability for that content.
When Congress passed Section 230, its vision, therefore, was to achieve both goals. And it did so in the following way. First, Section 230(c)(2) grants forums the ability to moderate user posts without taking on liability that traditional publishers would face. Thus, these sites can remove explicit, violent, and harassing content and still keep their legal protections. Second, to allow for the development of open fora, Section 230(c)(1) protects internet platforms from liability generated from third-party user created content. This simple idea led to an internet where everyone had a voice and platforms where everyone felt welcome.
This vision of an open internet, which Congress cemented into law in 1996, is as powerful now as it was then. So what changed? First, the emergence and consolidation of social media companies has meant that a handful of platforms now have extraordinary power over the ability of Americans to express their views. Leaning on the ambiguities of Section 230, platforms have argued that they have complete immunity for any editorial decision, a view that a handful of district courts, perhaps in dicta, have seemed to adopt.
With this extraordinary legal immunity, platforms have shaped events to fit certain narratives, deleted or otherwise disappeared information without notice, and control what people see and do not see. It is a sad irony that a law meant to promote openness on the internet instead delivered online speech into the care of private censors, leaving many silenced.
In May, with this state of affairs in mind, President Trump issued his Executive Order on Preventing Online Censorship. In this order, President Trump was unequivocal in his commitment to free and open debate on the internet. But he raised several troubling threats to this central value on the internet today. He pointed out that users of major social media services have been reporting that their posts have been flagged as inappropriate, despite not violating any terms of service. Without explanation, platforms have made changes to their policies to disfavor certain viewpoints, and some have even placed warning labels on users’ posts in a manner that clearly reflects political bias.
The President’s order sought increased transparency and accountability from online platforms. The goal for the administration is to protect and preserve the integrity of American political discourse. To that end, the order tasks NTIA, the agency I lead, with petitioning the Federal Communications Commission to clarify Section 230’s ambiguities and return the statute to its original purpose: to encourage openness and to allow for content moderation in the service of creating an environment welcoming to children and families.
In our petition, we asked the FCC to use its authority to make three things clear with regard to Section 230. First, Section 230(c)(1) and (c)(2) must be properly distinguished and delimited. Section (c)(1) is short. It’s 26 words. It simply says, quote, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” end quote. But what that means is not quite clear.
I think there are three possible meanings for what types of liability Section 230(c)(1) removes. The first one, I think, is correct: the most natural one, the one that hews closest to the meaning of the text and to the intent of the Congress that passed it, that 230 protects platforms from liability generated by third-party users, namely the libel, criminal threats, and obscenity that result from third-party posts. However, you could also say, as some people do, that 230 protects platforms for removing content, for deplatforming people, for editing content. Or third, you could say that it removes liability for creating unreasonable conditions of access.
The prevailing judicial and scholarly consensus goes for the first view, that (c)(1) protects third-party content existing on platforms, not decisions to remove content or conditions for allowing it on. Thus, (c)(1) protects platforms against liability for obscenity, criminal threats under state law, libel, and consumer fraud, but not decisions to remove content, content moderate, or deplatform individuals. Rather, Congress quite clearly, by the terms of the statute, said that Section 230(c)(2), with its limited purposes and good faith requirements, controls decisions to remove or edit content or deplatform individuals.
And of course, Section 230 does not address the conditions under which platforms may limit or restrict their platforms. That’s not to say they can’t. Of course, they have a First Amendment right to do so. But the rules of contract, the rules of civil rights continue to hold, and Section 230 does not protect against them. So that’s the first thing the petition does. It asks to clarify the meaning of Section 230’s protections.
Second, it aims to clarify that Section 230(c)(1) deems social media firms to be publishers, and thus outside its protections, when they remove, promote, comment on, or edit user content. That is clear. When social media firms act in this way, they are speakers, and they do not qualify for Section 230 protection.
Third, Section 230 does not protect a social media platform that shapes and controls its overall content according to a discernable viewpoint. Thus, when a platform plays a decisive role in promoting or shaping the content it hosts, it becomes a publisher in its own right and does not enjoy Section 230’s protections.
Finally, the petition also asks the FCC to impose disclosure requirements similar to those that broadband service providers such as AT&T and Verizon now face. These disclosure requirements, which Ajit Pai actually continued to impose upon broadband providers when he eliminated the network neutrality requirements several years ago, will allow for individuals to understand how content is used and circulated throughout the internet and will create a level playing field for internet regulation.
Taken as a whole, if the FCC acts on our petition, Section 230 would once again protect freedoms of expression while holding dominant platforms accountable for their editorial decisions.
Before I turn things over to the panel and to the questioners, I would like to address two critiques of our petition that have emerged since we sent it to the FCC last month. The first is that somehow the FCC doesn’t have authority to promulgate rules, to resolve ambiguities in Section 230. Case law on this is clear. Section 201(b) of the Communications Act empowers the Commission to prescribe any necessary rules and regulations to carry out the act.
And the Supreme Court has held that this rulemaking power extends to all subsequently enacted provisions of the act, specifically identifying those added by the Telecommunications Act of 1996. This rule was established in AT&T Corp. v. Iowa Utilities Board in 1999 and cemented in City of Arlington v. FCC in 2013. Those decisions established that 201(b) gives the FCC the power to clarify and implement any section that Congress chose to codify in the ‘34 Act. Further, in both the legislative text and its history, Congress made no suggestion of any intent to preclude the FCC’s ability to implement Section 230. We can dispense with the idea that the FCC doesn’t have the ability to do what we’re asking.
The second claim that has arisen is that somehow our petition violates the First Amendment of the giant social media companies. Indeed, I think that was a question implicitly present, at least in the title of today’s panel. But that is not correct. Our petition in no way seeks to control the free expression rights of internet platforms or condition those rights on neutrality or fairness, as some people have said.
The entire debate is around the special legal protections granted to them under Section 230. No one, no platform, has a First Amendment right to these protections. Newspapers can’t claim these protections, and neither can cable systems, book publishers, or other forms of traditional media. Section 230 protections are granted to online platforms when they meet certain criteria. It’s a quid pro quo. And the petition is simply asking the FCC to make sure there’s enough quo for the quid. That the First Amendment has been raised in this manner is, I think, a red herring designed to obscure those whose rights are actually being violated when social media companies decide to selectively censor political speech.
As President Trump said in his executive order, “All Americans should have a voice in today’s world of digital communications. Democracy and free speech mean nothing if our dominant networks silence certain people.” As you all know, the FCC has invited public input on our petition, and I encourage all interested parties to share their views during this comment period. The intent and meaning of Section 230 has been muddled for too long. The FCC has the power to change that. We are eager to see further action on this important issue after the public has weighed in. And I thank you for your time.
Nick Marr: Great, thank you. Our first caller, area code 310, you have the floor.
Mitchell Keiter: Hi. Mitchell Keiter calling. First off, please direct us on how to make that comment, the way to do that.
My question is whether it’s feasible to put social media organizations to a choice. For example, with schools, they’re either public, in which case they get a great deal of funding, or if they’re private, they won’t get the funding, but they have autonomy in matters, let’s say, of religion. Conceivably, that’s also a possibility regarding the debate over employees and independent contractors. People can select their own category.
Would it be feasible to have a social media organization say, “We are an open forum, and we’ll accept everything; and therefore, we’re not allowed to censor, but we have immunity,” or, “No, we’re a publisher, in which case we can do what we want with editing, but we will be liable for defamation.” Is that a possibility?
Hon. Adam Candeub: Yes. I think, in a way, that’s what our petition asks. Right now, we’re in this weird netherworld in which social media companies can say, “Oh, we’re not publishers, but we are doing all this editing and content moderation, and we get Section 230(c)(1) protection.” I think what the petition does is asking the FCC to make clear that if you’re going to have a point of view, that’s fine. But what you can’t do is have a sub rosa point of view and claim that your editing and content moderation and deplatforming is not expressive of a public point of view.
Nick Marr: Great. We’ll go to our next question now.
Caller 2: Hi. I was hoping you’d be able to draw some comparison with what I thought was the last major debate we had on this rough issue, the debate around net neutrality. Clearly there, we had the opposite situation, with many of the same people and groups arguing that an added regulation can have free speech implications, and then saying that the removal of a regulation would lead to censorship. And to my knowledge, anyway, we didn’t really see that play out with the withdrawal of common carrier status. But I was hoping that maybe you could compare the two from a legal perspective?
Hon. Adam Candeub: Sure. I actually think they’re quite different. Network neutrality was, in fact, a regulatory imposition. It was a requirement of nondiscrimination on platforms. Section 230 is sort of a corporatist gift already. It’s giving certain — well, not corporatist. Let me retract that. It’s a gift, a liability gift to certain types of internet firms. And all the petition does is ask that the conditions under which this gift is given are lived up to.
It doesn’t really require the imposition of additional regulations. If firms don’t want to deal at all with the FCC’s rules that we hope they’ll promulgate pursuant to this petition, then there’s an easy way not to. You just don’t claim the protections of Section 230. It’s a quid pro quo. If you don’t want the quo, then just don’t take the quid.
Nick Marr: Okay, let’s go to our next caller now.
Peter (?): Hi. This is Peter [inaudible 18:45]. It seems like Section 230 actually does allow the companies to restrict content for a pretty open-ended array of whatever they find objectionable in good faith. So it seems like the good faith is the key word there. So I’m wondering if you can explain what in your view the good faith is supposed to be and how far the FCC can go in its clarification on this point.
Hon. Adam Candeub: Yeah, that’s an interesting point. So you’re talking about, of course, Section 230(c)(2). And I would actually maybe push back on the notion that it is an open-ended requirement, an open-ended ability to censor. And I’m — I have to get the text of the statute. Wait one second. I’m pulling it up. I apologize. I should have — I just had it, and it just came off the computer. Okay, hold on one second. Okay.
So the provision that you’re citing is from (c)(2), and it says, “any action voluntarily taken in good faith to restrict access to or availability of material that a provider considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” I agree that that has to be done in good faith, but I do disagree that it is somehow permission for a platform to take off anything it objects to.
If you actually look at the words, this was part of a telecommunications statute, and these words have very specific meanings. Obscene, lewd, lascivious, and filthy come from the Comstock Act as amended in 1919, I think. And they specifically refer to the sort of dirty material that you wouldn’t want children to look at. Excessively violent, again, goes to one of the main concerns of the Communications Decency Act, the very title under which Section 230 was codified, and has to do with the V-chip. Congress at that time was very concerned about violent programming on television.
Finally, harassing actually refers, if you look at Section 223 of Title 47, to harassing telephone calls, which are illegal, of the type where you repeatedly call people, or crank calls. Kids would call up and say, “Is your refrigerator running?” And the poor victim would say, “Yes.” And the children would say, “Ha ha ha! You better go running and catch it.” And so these harassing phone calls are illegal.
So taken as a whole, I think that (c)(2) really only gives the power to edit and content moderate for reasons that serve creating family-friendly environments of the type that media regulation foresaw in the mid-‘90s. I think that’s probably closest to the congressional intent and to the text of the statute. “Otherwise objectionable,” I think, using the canon of ejusdem generis, refers to things like that. And that’s the way courts generally interpret lists such as that. So I think that (c)(2) really only gives special protection if platforms limit this type of content and do so in good faith, meaning in a fair and reasonable manner.
Nick Marr: Let’s take one more question, and then we’ll go to our panel.
Allum Bokhari: Hi. Allum Bokhari here from Breitbart News. You mentioned the original congressional intent of the legislation to allow family friendly environments on the internet. So my question is about what the end goal here is in terms of maximizing user choice. So Google for a long time has had a feature called SafeSearch, which users can turn off or on, and which essentially filters obscenity out of search results. Would you say that something like that is the end goal here, where instead of tech CEOs getting to determine what their users see and can’t see, users can instead determine that through a set of filters?
Hon. Adam Candeub: Absolutely. In fact, the petition does talk about that. The solution to this whole problem is to empower users to create their own environments. Certainly, the social network platforms have the power to give people the ability to block the sort of content that they don’t want to see. And that really is the solution. That the platforms insist upon centralizing this power, controlling what people say, and not allowing people to just see the sort of conversations they want to see reflects an infantilizing, perhaps even slightly totalitarian, perspective.
And so I think that would be a wonderful end game. And I think the disclosure requirements that the petition has really are what will help us get to that place because if people are able to see what and how content is moderated and controlled on social media and the internet in general, I think that will go to creating tools that will help people do it themselves.
Nick Marr: Thank you, Secretary. And apologies to the audience members who are still in the question queue. We’re hoping to have a couple minutes at the end for some more audience questions, so keep those in mind and have them for the end of the call. Okay, Jamie, I’ll pass it over to you to start the panel discussion.
Jamie Susskind: Okay, thank you. I’m pleased to introduce our all-star panel for this afternoon’s discussion. I’m joined by Jon Adame, who’s the General Counsel for Senator Marsha Blackburn; Eric Goldman, Professor of Law and Co-Director of the High Tech Law Institute at Santa Clara University School of Law; and Ashkhen Kazaryan, Director of Civil Liberties at TechFreedom. And Ash, I’m sorry if I butchered your name. I’ll try not to. So in the interest of time, I’m actually going to dispense with everyone’s full bios, but I would note that they’re posted on The Federalist Society’s website for this particular event, so feel free to look them up.
So I’m going to turn first to Ash. So we just heard from Assistant Secretary Candeub about the origins of Section 230 and the NTIA petition that’s now pending at the FCC. More broadly, I wanted to see if you had a reaction to some of the remarks, sort of a reaction to some of the points that Assistant Secretary Candeub raised about the arguments they’re making in the petition.
Ashkhen Kazaryan: Thank you, Jamie. And thank you, Secretary Candeub. There are a few things I want to clarify from the get-go. Number one is the congressional intent. So the congressional intent of CDA overall doesn’t matter because CDA was found to be unconstitutional by the Supreme Court. But Section 230 is the only surviving part of CDA. It was actually a separate bill that then was merged with CDA and then passed together as a compromise.
The authors of that bill, Republican Chris Cox and Democrat Ron Wyden, have been on record multiple times, including in 1996 on the floor, saying verbatim that they do not intend for the Federal Communications Commission to interpret Section 230. Chris Cox said that he does not want the Federal Communications Commission to turn into “a Federal Computer Commission with an army of bureaucrats,” and this is a quote, “regulating the Internet.” So that’s number one, and that’s addressing the intent behind the statute.
Now, number two would be about the jurisdiction of the FCC. And TechFreedom is going to file its detailed comments in this proceeding, so all those citations and everything I’ve been talking about can be found there and on our website. You can contact me if you want to look at it. So number one would be that both the courts and the FCC itself have concluded that Section 230 confers no authority on the FCC. If anything, the FCC has typically relied on 230 as [inaudible 27:07]. No FCC has ever thought it could regulate the application layer, that is, the edge providers, and doing it with Section 230 would be an enormous break from 25 years of tradition.
The contortion of “good faith,” by the way — I know there was a question about it — is beyond any reasonable reading of the statute. It’s an elephant-in-a-mousehole argument. Also, most importantly, and Professor Goldman can elaborate on this, the connection being drawn between (c)(2)(A) and (c)(1) just doesn’t exist. And I am also relying here on 25 years of court precedent and regulatory precedent. So those are the few things that I wanted to clarify, and now I’m happy to turn it over to Professor Goldman.
Jamie Susskind: Yes, I think that would be great. Go ahead.
Prof. Eric Goldman: Yeah, Jamie, I just wanted to see if it’s okay for me to pipe up.
Jamie Susskind: Yeah, that would be great. So I was just going to highlight that you had posted on your blog — unfortunately, I don’t have the website handy, but you did post a fairly lengthy analysis of the NTIA petition. So I was curious if you could give some more thoughts about that to the folks who are calling in today.
Prof. Eric Goldman: Yeah. Thank you for the opportunity. And I blog at blog.ericgoldman.org, and I did do a many-thousand-word deconstruction of the NTIA petition, some of which overlaps with remarks that Acting Assistant Secretary Candeub made earlier today. He made some differing remarks as well, which created other problems that the blog post doesn’t address.
But in my blog post, I basically explain that everything in the NTIA petition is fiction, both the factual characterizations of what was happening in the 1990s as well as the legal assessment of both what the statute says, its purpose, and its interpretation in the courts. So I don’t have time in this format to go through it point by point and talk about all the errors, but my blog post gives you a better crack at that. And I do encourage anyone who wants to see some of the structural deficiencies of the petition to take a look there.
I think that the biggest thing that baffles me about the petition is that it essentially ignores about a thousand cases that have interpreted Section 230 that have answered many of the questions that the petition puts into play. And so the petition just decides to strike its own course, untethered from what the discussion’s been like in the court system. It kind of cherry picks off a few things to try and stitch together a narrative, but it ignores really the guts of what has happened in the last 25 years in the courts.
And in particular, there have been a lot of cases that have answered the question about the interplay between Section 230(c)(1) and Section 230(c)(2). The case law really says that Section 230(c)(1) was designed to cover all forms of editorial decision making, which include both leave-up and removal decisions. They’re two sides of the same coin, really. And then Section 230(c)(2)(A) is designed to fill in some gaps; there are some ways in which that answer doesn’t provide a complete solution for websites engaged in content moderation. So there’s a reason why the statutes fit together the way they do, a structure the petition just chooses to ignore, really, I think, to its detriment.
And generally, that makes me wonder exactly who is NTIA working for and why are they doing what they’re doing? To me, I would think that a government agency would want to tell the story of here’s the lay of the land, here’s all the different arguments that have been made, here’s what the courts have done, and then help the FCC understand, okay, these are the options that we’ve seen. Here’s maybe one that we prefer.
But it didn’t do that. It just — let’s tell what we prefer and make a fictional story, a narrative behind it. So to me, I think the structural problem with the NTIA is I don’t really understand what the NTIA is doing. Are they working for the people, or are they working for someone else? That fundamental question of what is our government doing with our tax dollars to advance our interest I think is unclear when they bypass all of the existing precedent that is relevant to the question.
There are two more last points that I want to make, and I’m sorry for taking up so much time here, but we did hear a lot of conversation already. I just want to mention that Section 230(c)(1) plays a really critical role in our society because it created a new legal structure that sites can engage in editorial discretion but not face the same liability that they would face in other media with the outcome that we see new kinds of content created on the internet that didn’t exist in the offline world. There are things that people are doing on the internet today because of Section 230(c)(1) that they weren’t doing in the offline world.
And so I fear that — I think that the NTIA petition is designed to actually target and eliminate those new, special things that we get on the internet. That makes it a very dangerous proposal. It’s not actually engaging with the fact that the internet has new potential capacities, and what does it take, what legal infrastructures does it take in order to preserve those.
And the last point I’ll make is a minor one but a critical one. Acting Assistant Secretary Candeub used this phrase today, “private censorship.” And that phrase is misleading at best. And honestly, I would encourage none of us to ever use it again. There is no such thing as private censorship. We talk about private entities deciding what third-party speech to allow or not. We call that editorial discretion, and that is what’s protected by the First Amendment under the freedom of press and speech. So the idea is that any time we’re talking about private censorship, we’re actually talking about something else that’s really an important constitutionally protected mechanism.
But it distracts us from the government engaging in its own forms of censorship. And to me, that’s what we need to be working on. What we should be prioritizing is how our existing government is working on ways of engaging in censorship of us. That is the bigger threat. The government has greater power over us than any private entity does. It has military and law enforcement power, as well as the ability to take our tax dollars and use them against us. So every time we’re talking about private censorship, we’re actually not talking about the real threat in the room, which is government censorship. And that’s something I hope we will turn to in this conversation.
Jamie Susskind: Great, thank you. So I want to make sure that I turn it over to Jon also. Jon, your boss, Senator Blackburn, has had a lot to say about the idea of censorship on social media platforms and Section 230, so I was hoping you could talk a bit about her views on this issue, any plans that she might have. And if she has thoughts on the NTIA petition at the FCC or the process that’s going on there, we’d be happy to hear it.
Jon Adame: Yeah, great. Thanks, Jamie. So Senator Blackburn’s been engaged on this issue for a while, and she’s had her own individual instances of censorship by Twitter and some of the other companies. So I think she takes this a little bit more personally than some other people. And we really take a lean view on this, and we’re trying to find a way to engage on this with DOJ and the White House. And we’ve been having several conversations about what kind of path forward might exist that we can pursue this year.
To that end, we’ve been working with the chairmen’s staffs on Senate Judiciary and Commerce, looking for a narrower legislative fix that would increase accountability and find a way to give people a little bit more power in this equation. But it’s definitely a tricky balance, as you’ve heard from the panelists who’ve been talking today. And overall, on the NTIA petition, we think it’s a good thing to have this FCC opportunity to build a record. Let the public put their thoughts out there, and then we can reassess what we’re going to do moving forward after that. We’re definitely open to how other people see that.
Jamie Susskind: I did want to circle back. I’m not sure if Assistant Secretary Candeub had to go or if he’s still on the call. If he’s still on the call, I think we’d welcome any responses you have. If he’s gone, then we’ll just continue the conversation.
Hon. Adam Candeub: No, I’m happy to respond. First of all, getting back to — well, I’m very glad that Mr. Adame is supportive of what we’re trying to do at the FCC. As far as Professor Goldman’s remarks, I think that everything in the petition is largely consistent with most appellate decisions on this matter. I think it only takes issue with a few district court opinions, and it has nothing to do with censorship. I don’t really understand why it has anything to do with government censorship. It’s about clarifying a very generous liability gift that protects all internet users. That would be my takeaway.
Jamie Susskind: And if anyone else has any thoughts, happy to have you chime in. If not, I’ve got more questions for you all.
Ashkhen Kazaryan: I will say one thing about the concept and the way it’s been messaged. Section 230 in the last year has been messaged as a gift to tech companies. In 1996, none of the tech companies it’s now claimed to be a gift for existed. In 1996, it was about creating the right incentives for companies to moderate. It’s impossible to moderate at the scale that current social media platforms and other platforms do, and opening them up to liability is not going to be the solution. So calling Section 230 a gift, when all it did was enable creation and innovation and make the United States a world leader in the tech sector, is something I would not agree with at all.
The other thing about this is that the big tech companies that are always in the news and are being accused of quote, unquote, “censorship” or playing favorites — all of them can survive Section 230 being amended or changed. They can handle the liability. They have the means to do that. What this is going to do is suppress the newcomers, the startups, the smaller websites, because Section 230 doesn’t just protect Facebook and Twitter and Google. It protects everyone.
And by the way, it does protect newspapers, because every newspaper that has a website with a comment section is also protected by Section 230. So I just wanted to clarify that as we go into the rest of this discussion.
Jamie Susskind: Great. So obviously the comment period is open at the FCC. I know Ash said that her organization is planning to comment. My organization will comment.
So I was curious if folks have thoughts about going forward what the FCC should do with the petition and perhaps thoughts about what you think they might do. Obviously, Commissioner O’Rielly is leaving at the end of the year. We all heard that news, so that could impact things and the number of commissioners that might be available to vote on the item. What are your thoughts on that going forward on next steps here? I’m opening it up, anyone, any panelist who has thoughts.
Ashkhen Kazaryan: All right, I’ll start off. But I don’t want to monopolize the time, so I’ll be very short. For the reasons I’ve outlined — which are basically 25 years of FCC precedent and 25 years of overall court precedent — I don’t think, in my personal opinion, that the FCC is going to address the request in the NTIA petition. I obviously can be wrong.
I think the FCC should go through the appropriate process in addressing this, because obviously there is a lot of attention — both political and policy attention — on this question. And I think it should be discussed. And I encourage everyone, no matter what your position is, to file comments, because we should go through the appropriate steps of this process.
Jamie Susskind: All right. So looking forward, even past the FCC to later this year, the presidential election is coming. Obviously, the President has said his piece on Section 230 and social media platforms. And then Vice President Biden has also actually spoken up on Section 230. I have a quote. He said, “Section 230 should be revoked, immediately should be revoked, number one, for Zuckerberg and other platforms. It should be revoked because it’s not merely an internet company, but it’s propagating falsehoods they know to be false.”
So I wanted to get thoughts from all the panelists about where do you see the political momentum going as we move into election season with all of this? For Jon, is it coming from the Hill? What changes could be taking place there? Is it coming from the agencies still, depending on how things go? What are your thoughts?
Jon Adame: I think that’s pretty interesting context for this because if you look at it, this is ultimately going to have to be decided by Congress. But we’re in a position right now where the Republicans think that the companies are doing too much to be censoring people, and the Democrats think the opposite.
Given that kind of inertia, consensus is going to be difficult to find at this time. I think you’ve seen a smattering of proposals out there from various members, more recently on the Commerce Committee, who are interested in finding a way forward, but it seems like we’re still relatively stalled.
Prof. Eric Goldman: Jamie, if I can speak up on that question, there are a number of proposals in Congress to reform Section 230, ranging from big, structural changes to basically special-interest legislation, such as the act that’s designed to benefit landlords and hotels. So there’s a lot in play in Congress, and it creates a very dangerous ecosystem for Section 230.
And I think Ash had a really important point about whether Section 230 is a gift. Section 230 is a gift to all of us. It’s a gift that allows us to have conversations online as speakers without having to navigate the kind of media ecosystem that we had pre-internet. And so I think we’re in the fight of our lives for Section 230 right now, and I’m hoping that the listeners to this call will take a moment to reflect: What is it that they value out of the internet? What does Section 230 do to contribute to that? And how dangerous is the environment in which all of that could be at risk?
So you’ve got the left and the right. They can all agree that censorship by the government is a really good thing, and that creates an ecosystem where really weird deals could be struck to undermine Section 230 — deals that are not really in internet users’ interest. They’re really an exertion of government power. And I hope that scares all of us.
Ashkhen Kazaryan: There are two points I want to make in answering your question. Number one: if we look at the one law that was passed and did amend Section 230 — FOSTA-SESTA, the Stop Enabling Sex Traffickers Act — it obviously had a lot of congressional support, because its cause, stopping sex trafficking, is not one anyone in their right mind would want to oppose.
However, as TechFreedom and many other academics, nonprofits, and experts warned, what the law did — and it’s now been almost two and a half years since it passed — was shut down parts of the internet. It actually hurt efforts to stop sex trafficking, because sex trafficking has now moved from platforms that willingly collaborated with law enforcement, and wanted none of that horrible material on their platforms, into the dark web. And law enforcement doesn’t have the resources to investigate that.
It has also been used by prosecutors only once — about a month ago — in the two and a half years since it was passed. I’m not an expert on prosecution or sex trafficking crimes, but I am pretty sure that in two and a half years there have been many more sex trafficking crimes conducted online than one. This law did nothing but hurt the internet ecosystem. So we have to keep that in mind as we discuss any potential amendments to Section 230.
Now, I have never said that Section 230 is obsolete or untouchable, but I have yet to see a proposal that amends it in a way that’s not going to hurt the public discourse online. If you look at the tape of the antitrust hearing at the House Judiciary Subcommittee about a month ago, you can see that those members — Democratic and Republican alike — are unhappy with how moderation is conducted by platforms. And there were questions asked of the CEOs of those companies.
This only means that it’s not just the Republicans or conservatives who are unhappy with the content moderation happening. It’s also the Democrats. Pressuring tech companies into basically doing whatever you want to do might end up hurting the very, very sensitive internet ecosystem that we have.
And please let’s not forget that because of these platforms, we have had freedom of speech and support for opposition movements spread not just in the United States but all around the world. And that’s what we have to keep in mind every time we talk about amendments to Section 230. It’s the law that made the internet as we know it possible.
Jamie Susskind: Thank you, everyone. So one quick question before we move to audience Q&A: How do we have a productive conversation about Section 230 where everybody is not angry at each other? I’ve been engaged on this for years, and I think that, for better or worse, all sides feel very strongly about where they are, which I don’t think is wrong. But I’m interested in your thoughts about how we have productive conversations about the future of internet speech and the right policies going forward without folks talking past each other.
Hon. Adam Candeub: Yeah, I’ll take a stab at that. So one thing is I think we have to be realistic about the effects of these changes. So for instance, I’m listening and hearing the petition described, and I don’t know what people are talking about. Nothing in the petition disregards 20 years of jurisprudence. It’s quite consistent with jurisprudence. Nothing in the petition is interested in destroying the internet as we know it or destroying free speech. It’s about introducing a little bit of transparency and accountability and user choice.
And so it just amazes me how quickly people are willing to jump to these really dire predictions and portrayals of extremism, with very little reality behind them. Claims of government censorship arising from suggested reforms to Section 230 seem polarizing and unhelpful.
Jamie Susskind: Anyone else want to chime in?
Prof. Eric Goldman: I do. Just a couple of thoughts. So much of the discussion of Section 230 has been littered with false information about what Section 230 does. Things like the NTIA petition — my apologies to our esteemed speaker — actually exacerbate this problem, creating a false narrative about the law and the facts that takes us away from having a good, healthy conversation. So I was hoping that a call like this would actually up-level the discourse so that we understand exactly what the law says, what it does, and what opportunities there are to balance competing interests. That’s how we have the intelligent conversation.
But I don’t know how we’re going to get there as long as we continue to see people misrepresent Section 230 in very powerful ways. One of the most common misrepresentations — not that the NTIA petition does this as much as others, but I want to bring it up here because it has been such a structural problem in the discussion — is the conflation between what is protected by Section 230 and what’s protected by the First Amendment. If we change Section 230 but the First Amendment dictates the same legal outcome, then we haven’t actually made any progress toward any goal. All we’ve done is make Section 230 worse.
And so there’s a lot of misunderstanding, where Section 230 is treated as kind of a placeholder for bad content — even if that content is protected by the First Amendment, in which case nixing Section 230 doesn’t eliminate the bad content. It simply reshuffles the legal deck.
The other thing that I think would help improve the discourse is to abandon any hope that there’s a win/win/win outcome for all of us, that all we have to do is just get everyone in a room together and we’re going to come away with everyone getting exactly what they want. Content moderation is a win/lose game. Somebody gets what they want, and somebody doesn’t. And that person who didn’t get what they want is going to be grouchy about it.
So when we talk about the best path forward, the best path forward isn’t some kind of magical fantasy option that’s going to categorically make things better. The options we’re going to pick between are all flawed; the question is which is the least worst option — which option will do the least harm to the community?
And if we rephrase the options that way — acknowledging that we’re going to be talking to each other, that some people are going to engage in bad behavior in that process, and that we’re looking for the least worst option — we might conclude, actually, that Section 230 is the least worst option as of this day. But we’ll never conclude that as long as we’re holding out for some option that’s supposed to be better than all current options.
Jamie Susskind: So I’d like to try to find a few minutes for audience questions if we can. So Nick, do you want to cue up the next six minutes or so?
Nick Marr: Yeah, we can go to audience questions.
Caller 5: Hi, yes. I’m on the line. So I looked at the NTIA petition earlier, and I saw it made various references to the providers’ terms of service. And I’m wondering, as the current law stands, can Section 230 be raised as a defense if a user brings a terms of service breach claim against a social media provider?
Hon. Adam Candeub: Yes, it can. And I think part of the petition is to ask the FCC to clarify that. It’s saying, look, if a social media firm makes a promise to its users, it should be held accountable for that. It’s a fairly non-controversial view. And I think that’s something the FCC could clarify, and it would help all of us. Section 230 doesn’t absolve anyone of their promises.
Prof. Eric Goldman: Unfortunately — I’m sorry, Acting Assistant Secretary Candeub, if I cut you off there. Did you want to finish your thought?
Hon. Adam Candeub: No. Go ahead, please.
Prof. Eric Goldman: So my take on the law is that there’s actually a split in the case law on this very question of when Section 230 can apply to a breach of contract claim that’s based on terms of service. In general, I agree with our esteemed speaker that there are times when services should be accountable for what they say in their terms of service.
However, we’ve also seen efforts to weaponize terms of service against services to basically do an end run around Section 230. And we see tendentious, unpersuasive, and frankly just frivolous readings of the terms as a way to get around Section 230 — saying, I’m suing for breach of the terms of service, but what I’m really suing over is that I want to hold the site accountable for third-party content. In those circumstances, some courts do apply Section 230. And honestly, I think that’s a good thing.
Ashkhen Kazaryan: I want to bring up an example as an analogy. In 2004, I believe, and then later, the liberal, let’s say, branch of our political activists asked the FCC to go after Fox News because they were not keeping their promise of being quote, unquote, “fair and balanced.” And what the FCC chairman did back then was say that he was not in the business of regulating speech.
As Professor Goldman and Secretary Candeub mentioned, there are some very specific cases where courts can look at this. But overall, in the last few years I’ve often seen calls to use whatever is in a company’s contract or content moderation policy to go after the company when its decision was not to someone’s liking. That comes very close to regulating speech, and that is very dangerous territory.
Nick Marr: Okay, we’ll go to our next question now. Let’s see, this will be our last question of the day. Apologies to the callers that are left in the queue.
Caller 6: Hi. I just had a question for Professor Goldman. In the comments that he made earlier, he made a lot of references to government censorship, but looking at the petition, I don’t see anything in here that enables government censorship. Maybe he could address that for me.
Prof. Eric Goldman: Yeah, thank you for the opportunity. In fact, the entire structure of the petition is designed to attack the freedom that online services as publishers have in making their editorial decision making. So actually, it’s a really insidious way of advancing censorship by trying to attack their discretion.
Now, I think if Section 230 were changed in the way the petition describes, the First Amendment backdrop would still provide some protection. But we would lose a lot in that process, because Section 230 in that circumstance actually provides more discretion than the First Amendment would require. The petition is designed to take back some of that discretion that we all enjoy today. That’s the gift we’ve been given: the First Amendment would already guarantee the baseline, and Section 230 gives us the gift of all that extra editorial discretion, exercised by service providers, to help us advance our goals.
But really, my comment was more about the other facets of our government’s operations that are engaging in censorship today. And in case you aren’t aware of it — I apologize if you are — the Second Circuit has held that our President engaged in censorship in the way he managed his Twitter account. It literally said that our President engaged in censorship. We should be fighting that. Instead, our tax dollars are now funding an appeal of that case to the U.S. Supreme Court. So we’re paying for our President to fight for the right to censor online. To me, that’s really screwed up.
Ashkhen Kazaryan: Can I add one quick thing? So President Trump’s lawyers actually have invoked Section 230 to protect him from liability for retweeting other people’s tweets, which is what Section 230 also does. That’s the whole comment that I wanted to say. And also, oh, yes, the First Amendment protects Twitter from Trump. It does not protect President Trump from Twitter.
Nick Marr: Okay. Jamie, I’ll go back to you for any closing remarks, or any closing remarks anyone else has to offer, before we close it out this afternoon.
Jamie Susskind: Sorry, were you going back to me? This is Jamie. Thank you, everybody, very much for participating in a heated but thoughtful conversation. And I appreciate, as I’m sure the FedSoc does, everybody’s candor here. We really appreciate everybody’s time. And thank you all for joining us. As Nick said, I think this will be recorded and made available in different venues, so hopefully you can share it with folks at your companies and other associations and organizations.
Nick Marr: Yes, thanks, Jamie. And on behalf of The Federalist Society, I want to thank all of our experts for the benefit of their valuable time and expertise today. We wish we had more time, could have gone on for hours. But thanks, all, for calling in today. And thank you for all the audience questions, the great audience participation. Apologies if you were left in the queue. But keep your eye on The Federalist Society website for upcoming teleforum calls. And this afternoon, we’re adjourned. Thank you.
Dean Reuter: Thank you for listening to this episode of Teleforum, a podcast of The Federalist Society’s Practice Groups. For more information about The Federalist Society, the practice groups, and to become a Federalist Society member, please visit our website at www.fedsoc.org.
Featuring:
Hon. Adam Candeub, Professor of Law & Director of the Intellectual Property, Information & Communications Law Program, Michigan State University College of Law
Prof. Eric Goldman, Professor of Law and Co-Director of the High Tech Law Institute, Santa Clara University School of Law
Ashkhen Kazaryan, Senior Fellow, Free Speech & Peace
Jon Adame, Legislative Director for Senator Marsha Blackburn, Office of Sen. Marsha Blackburn
Moderator: Jamie Susskind, The Federalist Society’s Telecommunications & Electronic Media Practice Group