Oumou Ly: Welcome to The Breakdown. My name is Oumou. I’m a staff fellow on the Berkman Klein Center’s Assembly: Disinformation program. Our topic of discussion today is CDA 230, Section 230 of the Communications Decency Act, otherwise known as the twenty-six words that created the Internet. Today I’m joined by Daphne Keller from the Stanford Cyber Policy Center. So, thank you for being with us today Daphne, I appreciate it. Especially in helping us unpack what has turned out to be such a huge and maybe consequential issue for the November election, and certainly for technology platforms and all of us who care and think about disinformation really critically.

One of the first questions I have for you is kind of a basic one. Can you tell us a little bit about CDA 230 and why it’s referred to as the twenty-six words that created the Internet?

Daphne Keller: Sure. So first I strongly recommend Jeff Kosseff’s book, which coined that “twenty-six words” phrase. It is a great history of CDA 230 and it’s very narrative, you know. It just sort of explains what was going on in the cases and what was going on in Congress. So, it’s not just something for legal nerds and lawyers. It’s a really useful reference.

But maybe just to explain CDA 230’s role, I’ll pull back a little bit to the big picture of intermediary liability law in the US generally. So intermediary liability law is the law that tells platforms what legal responsibility they have for the speech and content posted by their users. And US law falls into three buckets. There’s a big bucket which is about copyright, and there the key law is the Digital Millennium Copyright Act, the DMCA. And it has this very choreographed notice and takedown process. And through Harvard’s Lumen database actually, there’s been just amazing documentation of how that process is abused, how much erroneous overremoval it leads to, kind of just what happens in that kind of system. That’s one big bucket.

The other big bucket that doesn’t get a lot of attention is federal criminal law. There’s no special immunity for platforms under federal criminal law, and so if what you’re talking about is things like child sexual abuse material, material support of terrorism…those things, the regular law applies. There is no immunity under CDA 230 or anything else.

And then the last big bucket, the one we’re here to talk about today, is CDA 230, which was enacted in 1996 as part of a big package of legislation, some of which was subsequently struck down by the Supreme Court, leaving CDA 230 standing as the law of the land. And it’s actually a really simple law, even though it’s so widely misunderstood that there’s now a Twitter account, Bad Section 230 Takes, just to retweet all the misrepresentations of it that come along.

But what it says is, first, platforms are not liable for their users’ speech. Again, for the category of claims that are covered. So this isn’t about terrorism, child sex abuse material, etc. But for things like state law defamation claims, platforms are not liable for their users’ speech.

And the second thing it says is, also, platforms are not liable for acting in good faith to moderate content, so to enforce their own policies against content they consider objectionable. And that second prong was very much part of what Congress was trying to accomplish with this law. They wanted to make sure that platforms could adopt what we now think of as terms of service or community guidelines, and could enforce rules against hateful speech, or bullying, or pornography, or just the broad range of human behavior that most people don’t want to see on platforms.

And the key thing that Congress realized, because they had experience with a couple of cases that happened at the time, was that if you want platforms to moderate, you need to give them both of those immunities. You can’t just say, “You’re free to moderate, go do it,” you have to also say, “And, if you undertake to moderate but you miss something and there’s, you know, defamation still on the platform or whatever, the fact that you tried to moderate won’t be held against you.”

And this was really important to Congress because there’d just been a case where a platform that tried to moderate was tagged as acting like an editor or a publisher, and therefore faced potential liability.

So that’s the core of CDA 230, and I can talk more if it’s helpful about sort of the things people get confused about, like the widespread belief that platforms are somehow supposed to be “neutral,” which is—

Ly: Well yeah! Would you please say a few words about that, yes.

Keller: Yeah. So, I mean Congress had this intention to get platforms to moderate. They did not want them to be neutral, they wanted the opposite.

Ly: Right. Exactly right.

Keller: Yeah. But I think a lot of people find it intuitive to say well, it must be that platforms have to be neutral. And I think that intuition comes from a pre-Internet media environment, where kind of everything was either a common carrier like a telephone, just interconnecting everything and letting everything flow freely, or it was like NBC News or The New York Times. It was heavily edited, and the editor clearly was responsible for everything that the reporters put in there. And those two models kind of don’t work for the Internet. If we still had just those two models today we would still have only a very tiny number of elites with access to the microphone. And everybody else would still not have the ability to broadcast our voices on things like Twitter or YouTube, or whatever, that we have today.

And I think that’s not what anybody wants. What people generally want is they do want to be able to speak on the Internet without platform lawyers checking everything they say before it goes live. We want that. And we also generally want platforms to moderate. We want them to take down offensive or obnoxious or hateful or dangerous but legal speech. And so 230 is the law that allows both of those things to happen at once.

Ly: Daphne, can you talk a little bit about the two different types of immunity that are outlined under CDA 230; we call them, shorthand, (c)(1) and (c)(2)?

Keller: Sure. So, in the super shorthand, (c)(1) is immunity for leaving content up, and (c)(2) is immunity for taking content down. So, most of the litigation that we’ve seen historically under the CDA is about (c)(1). It’s cases—often, you know, really disturbing cases where something terrible happened to someone on the Internet, and speech defaming them was left up, or speech threatening them was left up, or they continued to face things that were illegal. So those are cases about (c)(1). If the platform leaves that stuff up, are they liable?

The second prong, (c)(2), just hasn’t had nearly as much attention over the years until now. But that’s the one that says platforms can choose their own content moderation policy. That they’re not liable for choosing to take down content they deem “objectionable” as long as they are acting in “good faith.” And that’s the problem—it does have this good faith requirement. And part of what the executive order is trying to do is say, “Oh, well you have to meet the good faith requirement to get any of the immunities,” you know. If someone can show that you are not acting in good faith, then you lose this much more economically consequential immunity under (c)(1) for content that’s on your platform that’s illegal.

And sort of the biggest concern I think for many people there is, if this economically essential immunity is dependent on some government agency determining whether you acted in good faith, that introduces just a ton of room for politics, because my idea of what’s good faith won’t be your idea of what’s good faith won’t be Attorney General Barr’s idea of what’s good faith. And so having something where political appointees in particular get to decide what constitutes good faith, and then all of your immunities hang in the balance? That’s really frightening for companies.

And interestingly, today we see Republicans calling for a “fairness doctrine for the Internet,” calling for a requirement of good faith or fairness in content moderation, but for a generation it was, you know, literally part of the GOP platform every year to oppose the fairness doctrine that was enforced for broadcast by the FCC. You know, President Reagan said it was unconstitutional. This was just like a core conservative critique of big government suppressing speech for…decades, and now it has become their own demand, and they’re asking for state regulation of platforms.

Ly: That is so interesting to me. Both that and the fact that, you know…CDA 230 in so many ways is what allows Donald Trump’s Twitter account to stay up. So it’s really, really interesting that the GOP has decided to rail against it.

Keller: It’s fascinating.

Ly: So just recently the president signed an executive order concerning CDA 230 pretty directly. There was sort of an episode on social media where the president sent out a tweet, it was then labeled by Twitter—fact-checked in a way. Can you talk a little bit about what the executive order does?

Keller: Sure. So I think… I wanted to start at a super high level with the executive order… In the day or so after it came out, I had multiple people from around the world reach out to me and be like, “This is like what happened in Venezuela when Chavez started shutting down the radio stations.” You know, it’s just sort of…it has this resonance of like—

Ly: It has that feel. Yeah.

Keller: —there’s a political leader trying to punish speech platforms for their editorial policies. And that— You know, before you even get into the weeds, that high-level impact of it is really important to pay attention to. And that is the reason why CDT, the Center for Democracy and Technology in DC, has filed a First Amendment case saying this whole thing just can’t stand, and…we’ll see what happens with that case.

So then there are also in the executive order four other things that might be big deals. One is that DOJ is instructed to draft legislation to change 230. So, eventually that will come along, and presumably it will track the very long list of ideas that are in the DOJ report that came out this week.

A second is it instructs federal agencies to interpret 230 in the way that the executive order does, this way that I think is not supported by the statute, that takes the “good faith” requirement and kinda…applies it in places it’s not written in the statute. Nobody’s quite sure what that means. Because there just aren’t that many situations where federal agencies…care? about 230, but we’ll see what comes out of that.

A third is that Attorney General Barr of the DOJ is supposed to convene state attorneys general to look at a long list of complaints. And this is like, if you look at it, if you’re an Internet policy nerd, it’s just all the hot-button issues. Sort of like, are fact-checkers biased? Can algorithmic moderation be biased and—well, it can; how can you regulate that? You know, you will recognize these things if you look at the list.

And then the fourth one, and this is one that I think deserves a lot of attention, is that DOJ is supposed to review whether platforms—particular platforms are “problematic vehicles for government speech due to viewpoint discrimination,” and then based on that look into whether they can carry federally-funded ads. I think for most platforms the ad dollars part is not that big a deal, but being on a federal government blocklist of, you know, platforms with disapproved editorial policies…just like, has this McCarthyist feel.

Ly: Can you talk a little bit about the role of the CDA in relation to the business models that the platforms run?

Keller: Sure. So, broadly speaking, the Internet could not exist the way we know it without something like CDA 230. And that’s not just about the Facebooks of the world, that’s about everything all up and down the technical stack. You know, DNS providers. Cloudflare. Amazon Web Services and other backend web hosting. And also tons of little companies, you know. The knitting blog that permits comments, or the farm equipment seller that has user feedback. All of those are possible because of CDA 230. And if you pull CDA 230 out of the picture, it’s just very hard to imagine the counterfactual of how American Internet technology and companies would’ve evolved. They would’ve evolved somehow, you know. And presumably the counterfactual is we would have something like what the EU has, which boils down to a notice and takedown model for every kind of legal claim? But they… You know, they barely have an Internet economy for these kinds of companies. There’s a reason that things developed the way that they did.

Ly: Yeah. Do you think that there’s any… Maybe not what you think, because I’m sure that we can all agree this is likely to be the case. If the liability [?] that 230 offers platforms is removed, how would that change the way that platforms approach content moderation?

Keller: Well, I think a lot of little companies would just get out of the business entirely. And so there’s an advocacy group in DC called Engine, which represents startups and small companies. And they put together a really interesting two-pager on the actual cost of defending even frivolous claims in a world with CDA 230 and in a world without CDA 230, and it’s basically, you know, you’re looking at $10,000 to $30,000 in the best-case scenario for a case that goes away very, very quickly, even now. And that’s not a cost that small companies want to incur. And investors…you know, there are all these surveys of investors saying, “I don’t want to invest in new platforms to challenge today’s incumbents if they’re in a state of legal uncertainty where they could be liable for something at any time.”

So I think you just eliminate a big swath of the parts of… Of both the existing parts of the Internet that policymakers don’t pay any attention to. Like, you make them very, very vulnerable and some of them go away, and that’s troubling. And you create a lot of problems for any newcomers who would actually challenge today’s incumbents and try to rival them in serious user-generated content hosting services.

For the big platforms, you know, for Facebook, for YouTube…they’ll survive somehow. You know, they’d change their business model. They probably… The easiest thing to do is to use their terms of service to prohibit a whole lot more, and then just, like, take down a huge swath so you’re not facing much legal risk.

Ly: It’s hard to imagine living in that kind of a world.

Keller: It is. It is.

Ly: Yeah. Thank you so much for joining me today, Daphne. This was a great and enlightening conversation and I’m sure our viewers will enjoy it.

Keller: Thank you for having me.
