Joan Donovan: Hey, everybody. We are at the top of the hour here, and so it’s really exciting to be able to talk with this community, although I really…you know, miss the yellow house here. It’s always such an exciting experience to be able to, you know, hang out, shoot the shit and like, get into these issues in a real way. I feel quite a bit of distance from my communities of researchers because of the pandemic. But luckily I’ve been able to work with an incredible team at Shorenstein, with a little bit of crossover at Berkman, in order to sort of sustain my intellectual life.

And one of the things that we’ve been spending, I would say, the last two years of my life on—different folks have been engaged with it for different amounts of time—is this media manipulation casebook. And what we’re really trying to do is present a theory/methods package for studying media manipulation and disinformation campaigns. And over the last several years I’ve been really engaged in this specific topic because ultimately, I think, the question of media manipulation and disinformation for me really is: do we have a right to the truth? Do we have a right to accurate information? And as we’ve watched technology develop over probably the last two decades, we’ve seen the foreclosure of some of our other trusted institutions: we’ve seen, you know, the drying up of local journalism; we’ve seen universities come to rely on Google services for most of their infrastructure; we’ve started to see social media take hold of the public imagination. And none of these things are really designed to deal with what we’re going through in this moment, which is profound isolation, coupled with immense, and warranted, paranoia about our political system, and economic collapse. And you know, I spend a lot of time in my room, just like the rest of you, trying to figure out some of these big questions.

And so today I really want to talk about, well, what is the price that we’re really paying for unchecked misinformation; about what we are doing with this media ecosystem—to use a turn of phrase from Berkman Klein’s illustrious history—in the midst of a pandemic, knowing that the cost is very high if people get medical misinformation before they get access to timely, local, relevant, and accurate information.

I’m gonna share my screen; I’ve got some really great graphics that we developed for the Media Manipulation Casebook at mediamanipulation.org. These were developed by Jebb Riley, who is an amazing illustrator, and he’s been helping our team over the past few months sort of get our stuff together.

So this is who I am. I am right now the Research Director and the Director of the Technology and Social Change Project at the Shorenstein Center. But apparently the Shorenstein Center is just like, my house at this point. It’s really hard to think about what a return to the university’s going to feel like, given that I’ve spent several months working from home. But today I’m going to present on The True Costs of Misinformation: Producing Moral and Technical Order in a Time of Pandemonium. And I chose that word really intentionally, and I’ll tell you why.

But first a few key definitions. So what am I talkin’ about when we’re talkin’ about media manipulation and disinformation online? We’re talking about basic Comms 101. Media is an artifact of communication. So you know, notes and books and, you know, any kind of thing…memes, what have you, artifacts, things that are the leave-behinds of public conversation, for the most part, is what we study. But when I talk about news or news outlets, I will say “news” and “news outlets,” using that terminology. When I say “media,” I’m referring in a way to this kind of ephemera.

Manipulation. My team has debated this quite a bit, and if Gabby Lim is listening, she’s been at the forefront of us trying to get our definitions in order. Manipulation to us is “to change by artful or unfair means so as to serve one’s purpose.” And we leave the term “artful” in there because as I was doing a history of media manipulation on the Web, I was drawn into the work of The Yes Men, who are media activists who really, you know, figured out early on: who can own a domain? And apparently anybody can. Because they bought the domain associated with the WTO, they bought a domain associated with George Bush. And instead of really lambasting these groups, what they did is they impersonated them in order to draw in other folks. And there have been several documentaries made about The Yes Men and their media manipulation hoaxes. But for them, the point isn’t to keep the hoax going. The point is to reveal something about the nature of power. And so manipulation in that case, for me, is a sort of artful hack. And we can talk a bit about white hat and gray hat hacking if time allows.

But when we talk about something as disinformation we actually try to apply a very strict set of criteria. And we define it as “the creation and distribution of intentionally false information for political ends.” And intentionality…you know, for anybody on the line here that is a lawyer, you are just cringing right now. I can kinda feel it through the webinar. And don’t worry, I’m not going to talk about the freeze peach that just, you know, sets your hair on fire. But intention is hard. I’m not gonna lie. Who can know a man’s heart, right?

But the issue here about intentionality: usually, if disinformation campaigns are to set off a cascade of misinformation, manipulating algorithms in particular, these groups generally have to recruit from more open spaces online. And so we’ve seen, you know, different forums—Reddit groups, for instance—be repurposed to talk about the intention of what it means to spread disinformation and how to get one over on certain journalists. And so we are able to discover intention when we can discover where a media manipulation campaign is being planned. But it’s a really hard thing to assess without direct evidence. Nevertheless, when we talk about disinformation it’s because we have some direct evidence that points us to the intention of the campaign.

So, I want to recognize that we’re in a moment of extreme emotional deprivation…you know, social isolation. And this word “pandemic” is something I was really drawn to thinking about in its etymology: “of all the people, public, common, (of disease) widespread.” So “pan” and “demos” together, thinking about those kinds of ills that spread in these kinds of situations that are in some respects completely unpredictable and hopefully, at least for my lifetime, once in a lifetime.

But I prefer to call this moment “pandemonium.” And I’ll tell you why. “Pan” again meaning all, but instead of “demos,” “demonium”: evil spirit, evil, divine power, inferior divine being. And the reason why pandemonium for me is a better descriptor of this moment is I think back to right at the beginning, when most people were like, “Okay, we can handle this. We’ll get through this. This isn’t going to be a problem. We’re just gonna— You know what, everybody go home from school. We’re just gonna get on Zoom.”

And it created quite a chaotic environment. When we were thinking about how to write about the phenomenon of Zoom bombing, my coauthors Brian Friedberg and Gabrielle Lim and I were talking a lot about, well, where does this opportunity arise from, where you have a bunch of people adopting a brand-new technology very quickly, you have institutions buying into it at massive volume, and then you have students who are not really bought in to wanting to be on Zoom all day.

And so, some of the early instances of Zoom bombing that we saw were not what it ended up being, which turned into a kind of political and ideological war of racism and all kinds of other phobias. At the beginning it was a lot of pranking. Students dropping links to their classes into chat apps saying, “Hey you know, we’re gonna be talkin’ about this thing, like, why don’t you come in and pretend to be a student.” And students were using their phones to videotape themselves invading classrooms and then uploading the videos to YouTube.

And it was pretty funny, I’m not gonna lie. My favorite one was this video of a student just interrupting the professor and saying, “Hey, can I just pay you for an A? My other professors are lettin’ me pay them. I’ll give you three grand, we’ll call it a day.” And the professor’s just like, what is going on? Are you even in this class, right. And it’s just kind of jokey and hoaxy. But it was a way of people coping with the moment, trying to assess what was going on.

And then you saw more vicious use cases of Zoom bombing, where black women—black women professors—were being targeted. You had instances where LGBT groups were being targeted, Alcoholics Anonymous… And it went from pranking and hoaxing into something much more…well, the only word I can describe it with is disgusting.

And what it prompted, though, was a rapid change in the technology itself. Zoom didn’t just change their settings; they really had to interrogate the entire system, even thinking through where their servers are based and what kind of privacy protections they would need to put into place. But what’s interesting is that because Zoom had a closed business-to-business model and wasn’t, like social media, just out in the world for all to use, they were able to install these changes without an immense amount of blowback from the public. But when we see social media companies try to deal with some of the more terrible use cases…racist use cases, transphobic use cases, things grind to a halt. And we’ve seen over the course of this last summer even instances where social media is trying to clean up medical misinformation in order to prevent poisonings as well as people taking unnecessary risks. There’s been pushback against that as well.

And so it really has a lot to do with who the customers are, and who the technology company thinks they’re serving, in terms of how they envision what is possible for the design of their systems. And in moments of pandemonium—or what Foucault might call “epistemological ruptures,” or a paradigm shift—we see technology become much more flexible and malleable to the situation than it may have been in other situations that didn’t feel as critical as this one does right now.

And so what you’re living through in this moment is a really rapid succession of technological changes that many of us are just, you know, waking up to every day and being like, “Wait, what happened? They did what? Didn’t they say they were going to do this other thing?” And so as we as a research team try to reckon with this, we also have to think about, well, methodologically how do we capture this, and stylistically, theoretically, how do we know what to look for?

And so, I always turn back to the work of Chris Kelty, who was my post-doc supervisor and is an anthropologist and information studies scholar at UCLA. He wrote this book in 2008 called Two Bits, which is really also indebted to the work of Elinor Ostrom on governing the commons. He talks a lot about how to produce moral and technical order, and he is studying free software. And so thinking with his framework, I’m really drawn in by this quote:

Geeks fashion together both technology—principally software, hardware, networks, and protocols—and an imagination of the proper order of collective political and commercial action, that is, of how economy and society should be ordered collectively.
Chris Kelty, Two Bits: The Cultural Significance of Free Software

And what he’s really trying to say here is that the way in which our technology is built encodes a vision of society and the economy. And in building software in this way, we end up, recursively, with a society that in some ways mirrors that technology, but in other ways the technology really is distorted by the conditions of its production.

And so, thinking through that, I wrote this paper with Anthony Nadler and Matt Crain, Weaponizing the Digital Influence Machine: The Political Perils of Online Ad Tech. This was like a year and a half ago or so. And we came up with this concept of the digital influence machine, which is “the infrastructure of data collection and targeting capacities developed by ad platforms, web publishers and other intermediaries,” and which includes consumer monitoring, audience-targeting, and automated technologies that enhance its reach and, ultimately, its power to influence.

And so instead of thinking about just social media, we’re thinking about the architecture of advertising that spreads across the Web and social media, as a way to understand how the Web, or the Internet, reflects back its vision of society. And how is that infrastructure—specifically the reciprocity between data collection and the circulation of information through targeting—encoded, and what forms of power are then able to leverage that digital influence machine in order to produce…let’s just call it social change?

But that power is something that sociologists, comms scholars, every one of us on this webinar today, I think, are really interested in understanding. Because I’m not making the case that every instance of misuse of technology is the fault of some company. What we’re actually trying to understand is: as this technology scales, as it develops, what kind of social imagination is animating design decisions? And who can either purchase power in the system, or wield it by virtue of having very large social networks?

And so we’re not necessarily interested in all misinformation, or all instances of bad behavior online. What we’re interested in is how certain kinds of behavior scale, how people learn about them, how they adopt that kind of power, and how they wield it against a society that is using the Internet by and large for entertainment, using it to learn about things, using it to read the news, using it for their education, right. A lot of things are now passing through the Internet as a kind of obligatory passage point. But in doing so, in digitizing most of our lives and now even most of our everyday lives during the pandemic, what kinds of differences in power are manifesting themselves, and to what ends are we as a collective asked to shoulder the burden, or to pay the price, for the production of this particular kind of moral and technical order?

So I asked myself: if we’re in this situation and it’s…now, I would say, easier than ever to conduct propaganda campaigns, to hoax the public, to perform different kinds of grifts—I’m drawn into thinking about how at the beginning of the pandemic there were over a hundred thousand new domains registered with “COVID-19” or “coronavirus” as part of the domain address, part of the URL. So, in what ways are we paying for this kind of media ecosystem, this information environment, that ultimately doesn’t seem to be serving our broadest public interest—which for me, at this stage at least with the pandemic, is being able to access accurate information?
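As a toy illustration of the kind of signal that figure points to, here is a minimal sketch, in Python, of flagging newly registered domains that contain pandemic keywords. The keyword list and sample domains are hypothetical; a real analysis would work from a commercial feed of daily registrations, which this sketch does not assume access to.

```python
# Minimal sketch: flagging newly registered domains that contain
# pandemic-related keywords. The input list stands in for a real
# feed of daily domain registrations.
PANDEMIC_KEYWORDS = ("covid", "covid19", "covid-19", "coronavirus")

def flag_suspicious_domains(new_domains):
    """Return the subset of domains whose name contains a pandemic keyword."""
    flagged = []
    for domain in new_domains:
        name = domain.lower()
        if any(keyword in name for keyword in PANDEMIC_KEYWORDS):
            flagged.append(domain)
    return flagged

# Hypothetical example input, for illustration only.
sample = ["covid19-cures.example", "weather-report.example", "coronavirus-masks.example"]
print(flag_suspicious_domains(sample))
# ['covid19-cures.example', 'coronavirus-masks.example']
```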

So I’m thinking a bit through the lens of Siva Vaidhyanathan’s book Antisocial Media, thinking about, well, who pays for social media? And we know the little adage, of course: if the product is free, the product is you. But the product actually isn’t free. Advertisers are the ones that pay for it, and then you are the consumer of advertising through social media.

But Zuckerberg said this really interesting thing. He said, “I don’t think it’s right for a private company to censor politicians in a democracy.” This was during his Georgetown speech. And I thought, yeah, I can agree with that. Private companies shouldn’t be censoring politicians in a democracy. Cool.

But also, like, this seems really like a platitude. It resonates, but it just hits different when you start to think about, well, what do they mean by censoring politicians? And what do you do when you create the conditions by which politicians—or any ol’ person—can speak to millions at scale? What happens when you are not accounting for the fact that you have built a broadcast technology that is allowing for misinformation at scale?

And so those were my initial thoughts on this. And one of the things that I was drawn into in early January 2020 was a bit of a reversal in Facebook policy, where they write, “In the absence of regulation, Facebook and other companies are left to design their own policies. We have based ours on the principle that people should be able to hear from those who wish to lead them, warts and all…” And that kind of statement, “warts and all,” made me wonder a bit about how they’re really gonna reckon with the way in which different politicians are using their system, not just the “organic”—quote, unquote; we could get into all kinds of discussions about why that metaphor is wrong. But what is it they’re actually trying to get at when they say “warts and all” when it comes to politicians who are using both their advertising systems and other forms of social media marketing to essentially delude the public, right? To not just put forward a political position, but also to gin up all kinds of suspicion… And of course this is before the pandemic really takes root. But you know, Facebook’s reaction to the situation that we were in was basically like, “Well, if you don’t regulate us, we’re just gonna, you know, kinda have to let it happen.”

And then the pandemic hits, and Facebook realizes that they’ve become this central figure not only in medical misinformation campaigns but also in this effort, as Yochai Benkler and Rob Faris’s group have shown, to make people believe that mail-in voting is insanely corrupt.

And so eventually Facebook does have to change their policies on political advertising. Because they realize that at scale, it’s different. Scale is different. You know, Clay Shirky often talks about how more is different, right. And in sociology we don’t understand ourselves as psychologists because we know that more is different; society’s actually different. And so when you’re dealing with misinformation at scale, the people who pay the price don’t tend to be the companies at all, but really end up being the people who are information consumers, let’s say.

So how do we study this? How do we study misinformation at scale? How do we make sense of it? Our new web site is up now. And what we did is put together a theory about the media manipulation life cycle. If you want to study these things, we recommend that you look for essentially five points of action: where is the manipulation campaign being planned, and what are its origins; how does it spread across social platforms and the Web; and what are the responses by industry, activists, politicians, and journalists.

This is crucial. If nobody responds to your misinformation campaign, not much happens. You know, in 2016, 2017, even earlier than that, Whitney Phillips’ work pointed us to the fact that this kind of media hoaxing performed by, you know, trolls and other folks through 4chan had become a bit of a game: to get journalists to say wrong things, to try to get politicians or other folks to chime in, to “trigger the libs.”

And so when we’re thinking about how a media manipulation campaign is gonna succeed, we’re actually trying to understand, well, who’s going to respond. Because you know, there are going to be so many…let’s just call them shenanigans online that it would be impossible to moderate that at scale.

Then we looked closely at mitigations. So in 2016, 2017, really the big tools of social media companies were content moderation—including takedowns—and the removal of certain accounts. They weren’t really in the business, in 2016, 2017, of demoting content. There was some removal of monetization. But over the last year in particular we’ve seen a number of different ways in which platform companies are willing to do moderation and curation of content online. And so Stage Four has expanded—but without any transparency. And so we’re often backing into problems of content moderation: as we’re trying to understand the scale and the scope of a media manipulation campaign, we often run into moderation that has not been recorded in any public kind of way.

And then the last thing we look at is the adjustments by the manipulators to the new information environment. So if some action is taken, or if they enjoyed some success, we will see that campaign happen again and again and again.
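For readers who want to apply this life cycle to their own cases, here is a minimal sketch of what recording a campaign against the five stages might look like in code. The class and field names are illustrative inventions, not the Casebook’s actual schema; only the stage names come from the talk.

```python
from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    """The five stages of the media manipulation life cycle described above."""
    PLANNING_AND_ORIGINS = 1
    SEEDING_ACROSS_PLATFORMS = 2
    RESPONSES = 3   # by industry, activists, politicians, and journalists
    MITIGATION = 4  # takedowns, demotion, demonetization, etc.
    ADJUSTMENT = 5  # manipulators adapt to the new environment

@dataclass
class CampaignRecord:
    """One case study, with evidence grouped by life-cycle stage."""
    name: str
    evidence: dict = field(default_factory=lambda: {stage: [] for stage in Stage})

    def log(self, stage: Stage, observation: str):
        self.evidence[stage].append(observation)

# Hypothetical usage:
case = CampaignRecord("Example viral slogan campaign")
case.log(Stage.PLANNING_AND_ORIGINS, "Forum thread proposing the slogan")
case.log(Stage.RESPONSES, "National outlet covers the flyers, amplifying the slogan")
```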

So if we can answer the question around, you know, who pays for social media—and we know that advertisers are pumping money into it, and we know that, you know, the public is by and large the crop that is being harvested—we have to think about this category of misinformation a little bit differently. We have to think about who is actually called into service to mitigate misinformation at scale. I won’t make you suffer through the open hearing that I was part of, but it is on YouTube. Not ironically, that is where they air the Select Committee on Intelligence hearings. And we did a hearing on misinformation, conspiracy theories, and infodemics. I’m just gonna talk a little bit about what I presented in that hearing and then how I think it related to the Media Manipulation Casebook.

So, when we’re thinking about who pays for misinformation, I’m really thinking about several categories of professionals that have now started to build their livelihoods and careers around handling misinformation at scale. So journalists…we know the story, many of us are probably involved in this, where our role is one of finding misinformation online, or disinformation campaigns, and then ringing the alarm bell and trying to get platform companies to go on record. This is a new beat for journalists. Over the last several years journalists have really honed skills that are unlike any other, using open source intelligence as well as other forms of online investigation, even digital ethnography, to get to the bottom of misinformation campaigns.

Public health professionals. I’ve talked to more public health professionals over the last few months than I have in the span of my lifetime. They are just…deluged by people who are scared, people who are confused, and people who want to know more about COVID-19 because they saw online, you know, that 5G’s causing coronavirus and therefore no vaccine will work. They saw online that Bill Gates actually unleashed this on the world, and so we need to put Bill Gates in jail and the rest of this will end. Public health professionals have been called in to do quite a bit of misinformation wrangling.

Civil society leaders… I’ve been working with a group of folks through the Disinformation Defense League for the past several months, really focused on get out the vote campaigns, right. When your get out the vote campaign isn’t just about letting people know when, where, and how to vote, but is really about trying to get them to understand that voting is not imperiled—that mail-in voting is not imperiled, or other forms of voting, or that the machines aren’t rigged—that is the role that civil society has had to take on as a result of misinformation at scale.

And then lastly, law enforcement and public servants like election officials, probably best exemplified by the recent controversy around the web site operated by CISA, which Chris Krebs was at the helm of until he was fired via Twitter. You know, these are the kinds of folks that are fielding these questions about voter fraud and misinformation. And they don’t have budgets for that. Nobody was like, “Oh yeah, you know what, we also need to have a huge misinformation budget so that election officials can let people know that their votes were counted, and that this is the way in which the machines that they used worked, or this is how we verified signatures.” Right, I anticipated it, but for the most part we didn’t anticipate the scale at which this would be occurring.

And so, when we talk about this on our web site—and we have fourteen or so case studies up there, and about another nine or so that are gonna go up by end of year—one case in particular, around the Ukraine whistleblower, we were looking at the different kinds of media manipulation tactics that were used. And journalists by and large had to navigate a very twisted terrain of trying to cover the Ukraine story without saying the name of the whistleblower, which was traveling in very high volumes throughout the right-wing media ecosystem. And there were a lot of attempts to try to get center and left media to say this person’s name. And for the most part they resisted, and they didn’t cover the disinformation campaign, because of the role and the status that whistleblowers historically play in our society, which is that we should be protecting their anonymity.

As well, the Plandemic documentary and the way that that was planted online. They knew it was gonna violate the terms of service on every platform, so they set up a web site on which you could engage with the content, watch the content, but then also download it. And they had a set of instructions that basically said, “Download this and reupload it to your own social media.” And this happened thousands and thousands of times, so it was really difficult to actually get that video removed from, let’s say, prominent platforms, because it’s still on the Internet. This tactic of distributed amplification, we’ve seen it before. But we don’t have a lot of great solutions to it when it comes to dealing with medical misinformation in particular.
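Part of why distributed amplification is hard to mitigate is that every reupload is a re-encoded copy, so exact file hashes never match. Platforms therefore lean on some form of perceptual matching. The sketch below is a toy version of that idea, assuming each upload is represented by a thumbnail frame on disk; it uses the open source imagehash library, and the distance threshold of 8 bits is an illustrative choice, not a production value.

```python
# Toy sketch of perceptual matching for reuploaded video content.
# Assumes each upload is represented by a thumbnail frame on disk.
# pip install pillow imagehash
from PIL import Image
import imagehash

MAX_HAMMING_DISTANCE = 8  # illustrative threshold, not a production value

def is_probable_reupload(known_frame_path, candidate_frame_path):
    """Compare perceptual hashes of two frames; small distances survive re-encoding."""
    known = imagehash.phash(Image.open(known_frame_path))
    candidate = imagehash.phash(Image.open(candidate_frame_path))
    return (known - candidate) <= MAX_HAMMING_DISTANCE

# Hypothetical usage (file names and review hook are invented):
# if is_probable_reupload("banned_video_frame.png", "new_upload_frame.png"):
#     queue_for_review("new_upload_frame.png")
```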

As well, when we’re thinking about viral slogans and the ways in which civil society has had to deal with white supremacist and extremist speech online, we have a case study about the slogan “It’s okay to be white” and the way that it moved from flyers that were planted on college campuses with a very plain message. There was no indication of who was doing this, other than if you knew where to look on 4chan, you knew that it was a campaign by white supremacists and trolls that basically asked people to put this flyer up in public places, a flyer that just says “It’s okay to be white.” In Massachusetts somebody put up a banner over the highway, trying to get media and other folks to take pictures of it. And this kind of viral sloganeering is something that civil society organizations have really had to reckon with and call attention to, so that people understand every [utterance] of this slogan is meant to create the conditions by which people would discuss racism and race, but through the lens of whiteness, and to try to normalize discourse about white identity.

And then public services. One of the case studies that we have up there is about a forged Maxine Waters letter. The letter looks as if— I mean, there’s a lot of digital forensics that make you realize very quickly that this is a forgery. But it was a letter written from “Maxine Waters” to a bank, basically saying “If you donate a million dollars to me I will bring”—I think the figure was like 38,000—“immigrants who are all going to need mortgages to this area. And so if I win, then we all win,” kind of thing. And this letter was planted by her opponent, and then through the use of bots and other kinds of automated technologies it was promoted online.

But it actually ended up with the FBI having to get involved. And we’ve seen numerous instances now, especially during the pandemic, where law enforcement are being called up with these rumors and being asked, you know, will you step in and deal with antifa setting fires in my neighborhood? And law enforcement are like, “Where is this coming from? Why now are we being asked to deal with misinformation at scale and these kinds of rumors?” And of course election officials, as I mentioned earlier, are also being called into the fray.

So what does it all mean? I wrote a piece in MIT Technology Review on October 5th of this year—of…still this year; yeah, it’s December 1st, rabbit rabbit—called “Thank you for posting.” And I make this argument:

Like secondhand smoke, misinformation damages the quality of public life. Every conspiracy theory, every propaganda or disinformation campaign, affects people—and the expense of not responding can grow exponentially over time. Since the 2016 US election, newsrooms, technology companies, civil society organizations, politicians, educators, and researchers have been working to quarantine the viral spread of misinformation. The true costs have been passed on to them, and to the everyday folks who rely on social media to get news and information.
Thank you for posting: Smoking’s lessons for regulating social media

So if we were to try to restore moral and technical order—I’ve got like a list, I’ve got lists all over the place of things that I think we could do, but I’d love to discuss some of these with you.

I think we need a really good plan for content curation, coupled with transparency in content moderation. I’ve argued in the past for hiring 10,000 librarians to help Google and Facebook and Twitter sort out the curation problem, so that when people are looking for accurate information and not for opinion, they can find it. You know, if you think about Google search results, the things that become popular are the things that are free. Anything that’s behind a paywall is not something that people are gonna continue to return to. And so as a result, Google search takes on the quality of a free bin outside of a record store, right. Every once in a while there’s a gem at the top, but not usually.

We also need a distribution plan for the truth that supports public media. And social media companies must deliver timely, local, relevant, and accurate information. We’ve seen this happen with pandemic information, where there are lots of these, like, yellow banners showing up on web sites. That’s not a plan; that’s like…a sticky note. So we need something else.

We need to develop a policy on strategic amplification that mirrors the public interest obligations of other broadcast companies. When we think about strategic amplification: if something is reaching 50,000 people on broadcast or radio…we have rules for that. So when something is routinely reaching a certain amount of views, or a certain amount of clicks, or a certain amount of people, especially if it’s from a certain influencer, there has to be some kind of measure that will help us understand when misinformation, or hate speech, or incitement to violence is circulating at epic volumes—what is the protocol that should exist across social media sites?
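A minimal sketch of what such a protocol trigger could look like, by analogy with broadcast thresholds. Only the 50,000 figure comes from the talk; the metric name and the review rule are hypothetical:

```python
# Minimal sketch: a reach-triggered review rule, by analogy with
# public interest obligations on broadcasters. Field names are
# illustrative; only the 50,000 figure comes from the talk.
REACH_THRESHOLD = 50_000

def needs_public_interest_review(post):
    """Flag content for human review once it reaches broadcast-like scale."""
    reach = post.get("unique_viewers", 0)
    return reach >= REACH_THRESHOLD

posts = [
    {"id": "a1", "unique_viewers": 1_200},
    {"id": "b2", "unique_viewers": 73_000},  # broadcast-scale reach
]
flagged = [p["id"] for p in posts if needs_public_interest_review(p)]
print(flagged)  # ['b2']
```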

And then lastly, I think something that fell out of view but is still very important: technology companies, including large infrastructure services, must fund independent civil rights audits where auditors are able to access the data needed to perform investigations, including a record of decisions to remove, monetize, or amplify content. So we need much more transparency, and this might come in the form of an agency that can deal with this.

So these are four of the ideas that I’ve come up with off the top of my head, wrote on the back of a napkin, and didn’t give much thought to. I’m kidding. I spend my whole life steeped in this nonsense. And the only thing that sustains me through it, I think, is knowing there are people like the good folks at Berkman Klein that want to deal with these problems, and want to deal with them responsibly, but also understand that these are problems that harm different groups of people disproportionately. And we’ve got a really great whitepaper up on mediamanipulation.org from Brandi Collins-Dexter specifically about how COVID-19 misinformation manifests in black communities online.

And so as we deal with the pandemic, as we deal with the questions of moral and technical order, we’re really just striving to answer: do we have a right to the truth? And if so, how do we get there? Right. And so that’s the thing that’s been my pain for many, many years now. But I’m hoping that through, you know, the next several years of an administration that is uh…potentially, I don’t know—I don’t even know—sympathetic to dealing with harassment of women, with the ways in which certain communities are underserved online, particularly black communities, and with the kinds of misinformation that’re pervading Latino communities as well, that we will get somewhere on some of these issues. But it’s going to be a very, very long process.

[Slide: a collection of tin robot toys arranged in a group, some with a raised arm as if waving, next to a list of Donovan’s research team: Brandi Collins-Dexter, Emily Dreyfuss, Rob Faris, Brian Friedberg, Dwight Knell, Nicole Leaver, Gabby Lim, Jen Nilsen, Poornima Rajeshwar]

So I just want to thank my team, and the folks that help me think through these problems every day. And we are now open for questions.


Moderator: Awesome. Thank you so much, Joan. So we definitely have a bunch of questions coming in, and we’ll start with some of the ones that are getting some additional thumbs up, as those seem to be the most compelling to people. So one is from Madeline Miller and states, “As a student currently doing a professional degree in library and information science, what can I do about being part of a team of 10,000 librarians working on community misinformation?”

Donovan: Yeah. I think like— You know, I would love for the ALA to step in and really create a program that allows for this kind of thing to develop at conferences. As well, you know, there have been different efforts to build a digital public library. But I do think that we need more librarians’ voices embedded in industry. And it pains me to say that because ideally, as the utopian that I am, we would build a broad public infrastructure that deals with these problems, and I’m, like, very excited for Ethan Zuckerman’s new lab at UMass to deal with some of these issues. But for now, we have what we have. And we do need folks to start to think about, well, if we were going to take, let’s say, twenty consistent disinformation trends and deal with them specifically, how would we reformat search so that the first three to five things that people see are actually things that have been vetted?

Francesca Tripodi has this really great report at Data & Society on searching for alternative facts. And one of the things that she discovered in her, you know, ethnography was that a lot of people believe the thing that they see first on Google has been fact-checked in some way, or vetted in some way, or is the truest thing—and it’s not. And so I think there’s a lot of work to be done by librarians to sort out our information ecosystem and make good models for how we would advance a knowledge-based infrastructure rather than a popularity or pay-to-play infrastructure, which is what we have now.
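Both of those answers circle the same mechanism: search results where vetted sources outrank popular or pay-to-play ones. Here is a minimal sketch of that reranking idea, with a hypothetical hard-coded vetted-domain list standing in for the librarian-curated indexes being called for:

```python
# Minimal sketch: reranking search results so vetted sources appear first.
# The vetted-domain set is hypothetical; a real system would draw on
# librarian-curated indexes rather than a hard-coded list.
VETTED_DOMAINS = {"cdc.gov", "who.int", "loc.gov"}

def rerank(results):
    """Stable-sort results so vetted domains lead, preserving order otherwise."""
    return sorted(results, key=lambda r: r["domain"] not in VETTED_DOMAINS)

results = [
    {"title": "Miracle cure blog", "domain": "cure-blog.example"},
    {"title": "Vaccine safety overview", "domain": "cdc.gov"},
]
print([r["domain"] for r in rerank(results)])  # ['cdc.gov', 'cure-blog.example']
```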

Moderator: Alright, thank you. So this is one that came up early, and I think it’s a question that often comes up in this realm: how does your definition of disinformation differ from propaganda? And I might add to that, you know, how is this challenging given that those are, for lack of a better word, unstable categories with many of the different entities that we’re looking at here?

Donovan: Yeah. Yeah. [chuckles] Well, the reason why disinformation is actually an interesting category to work with is because it shows up in, you know, American discourse prior to 2016 as something that is sort of uniquely Russian, in the sense that these are the kinds of campaigns that are associated with Russian political tactics, information warfare… But as the US starts to figure out that there’s disinformation happening, you get this discourse of fake news. And you know, Claire Wardle did a lot of work to try to tell people not to use that word because it was playing into political divides.

And I’ll give you one anecdote about why “fake news” was so treacherous to work with. When you would talk with designers or technologists at these social media companies, they would just say, “Well, you know, fake news…real news…like what’s the difference, it’s information,” you know. And they didn’t understand what we were actually talking about. Which is to say that they were using these popular definitions, and Trump had obviously made an enemy out of The New York Times and CNN by then, saying they were fake news. But what we were actually talking about was something that Craig Silverman had looked into when he was, I think, a Nieman fellow, where there were these cheap knockoff web sites that were made to look like news but were really just about clickbait, and they’d come up with any old headline that would make you want to click through the content.

So fake news for us was a technological problem brought about by the monetization of advertising, where you had these fake web sites. But then the politicization of that term made it seem like, well, when you’re talking about fake news, you’re talking about a political category. And so when disinformation started to become something you could talk about without it necessarily being aligned only with discussions of Russia, it felt like a better fit than dealing with the fake news discourse in particular.

But then on top of that, when Network Propaganda came out, of course, propaganda got put back on the map in a serious way, to look at the phenomenon of media elites who were using both the online and the broadcast environment in order to create a zone of information that was politically motivated.

And so for me, when I think about propaganda I’m thinking specifically about, you know, the way in which that book positions networked propaganda as a kind of tool of media elites. But with disinformation, for us as a research team, we’re really trying to look at the incentivization. And we deal a lot with fringe groups. We don’t necessarily… Of course we’re avid, avid watchers of Tucker Carlson. But insofar as he’s like the shit filter: if things make it as far as Tucker Carlson, then there’s probably much more, like…stuff that we can look at online. And so sometimes he’ll start talking about something and we don’t really understand where it came from, and then when we go back online we can find that there’s quite a bit of discourse about “wouldn’t it be funny if people believed this about antifa.” So yeah, I know it’s not a clean answer, but that’s sort of how we arrived at where we are.

Moderator: Awesome. Thank you. So, to follow that, particularly as you start to talk about the media and those distinctions, the next question we have is… I know this will be a more complicated answer. Have any social media companies expressed any willingness to work with groups like yours on squashing misinformation on their platforms? And I guess I’m thinking about that particularly as you were describing the fringe groups and how they’re taking advantage, and often the media companies are profiting from that advantage.

Donovan: Yeah. I mean, we’ll take meetings with anybody, we just won’t take their money or data, right. And so the idea here is basically that you need to have a pretty— For my team especially, we have a pretty strict rule about how we get our data and the way in which we engage with platform companies—or any company. We will take meetings literally with anyone, right. Because for us it’s not about them getting us to see it their way. It’s about us getting them to see it our way, right. Education, when done well, is when we all arrive at a shared definition of the problem. And so for us, engagements with different companies really are about showing them something they couldn’t see, because they were stuck in a specific mode of thinking.

So for example, think about all of the different ways in which research could’ve been done about disinformation in the elections. I think the approach of the Berkman Klein team—looking very deeply at mail-in voter fraud claims using Media Cloud—was the right approach. They didn’t, you know, go out and solicit a bunch of information and funding from platform companies that would mire them in questions of their allegiances and whatnot; they just did what they knew how to do. And they, like many of us that remain fairly independent, were able to see that the question wasn’t really about how many misinformation campaigns we were going to see, big or small. It was really about having the best knowledge we could of the ones that were gonna have what looked like the most impact, because they had a couple of signature aspects to them: media elites were picking them up, political elites were pushing them, and they were forcing a public conversation that wouldn’t have happened if not for the design of our media ecosystem operating in this way.

One of the things I don’t think any of us could have predicted, though, was the decision of large-scale media corporations like The New York Times or The Washington Post, or even The Wall Street Journal, not to pick up that political propaganda and parrot it back out, especially during the time of the, you know, dun dun dun, the Biden laptop story.

So I think that when we as research teams engage with industry, it’s really, really, crucially important—I can’t underscore this enough—to go with the method that you know. And to go with the mode of analysis that is most authentic to what it is that you’re trying to study. So for us it’s mostly qualitative digital ethnography. Like, we watch the content, we understand who the players are, we understand the scene, and we take that pretty seriously. And so as a result, we don’t get stuck in, you know, questions about oh, did this have any impact, or did this do that, or blah blah blah. What we can show very cleanly is the progression of facts: we can show empirically how these things scaled, and then we can look at the kinds of reception and mitigation attempts that platform companies have had. And then we can evaluate those based on whether manipulators choose to abandon that project, or choose another route. And so that’s different from other research houses that do very similar kinds of things.

And I think the other challenge journalists have is a very similar one, which is that none of us want to get stuck in the role of becoming, like, an arm of the industry just doing content moderation. That’s not an interesting piece of the puzzle. For me, the interesting piece of the puzzle—and this is probably just cuz I’m a nerd—is like, how do we make sure people can access accurate information and make decisions about their lives based on that information. As well, like, I also wanna have a little bit of fun. So I enjoy a good prank, and I enjoy a good hoax. But I think that by and large, when we’re dealing with misinformation at scale, it’s a feature of these industry practices, and therefore we can’t assume that the industry is gonna be able to see itself for what it is.

I think we have time for maybe one more question.

Moderator: Exactly what I was thinking. So, this one I think ties along with that last one and is a good kind of wrap-up question, from [Charmaine?] White: “Can you elaborate on the concept of a distribution plan for the truth? How is it possible for social media companies to deliver timely and accurate information when communications on them are instantaneous, in real time, and the number of contributors to these networks is ever-expanding?”

Donovan: It’s a hard problem and needs more research. And that’s why I really value the work of librarians in thinking through these kinds of taxonomies that we’re gonna need, and the kinds of ways in which we might want to hold out certain categories. I’m thinking here of Deirdre Mulligan’s work on rescripting search. She’s got this beautiful paper about do we have a right to the truth. So what happens when you Google “Did the Holocaust happen?” Back when she was writing her paper…terrible things happened when you Googled “Did the Holocaust happen?” You were actually brought directly into anti-Semitic groups who would, you know, post literally every single day anti-Holocaust, anti-Semitic content.

And this is an experience I had as a researcher looking at white supremacists’ use of DNA ancestry tests. It wasn’t the case that when you were looking up certain kinds of white supremacist claims, you were given information about white supremacist groups and why they were bad. The SPLC does a really good job of tracking that; it just wasn’t rising to the top of Google. What was rising were, you know, white supremacist groups like Stormfront. And so for me it was really important to think through those questions of, well, what do you get when you search for X? What do you get when you search for Y? And then how do the algorithms reinforce that?

danah boyd and I wrote about self-harm. And for a while, if you were to look up how to injure yourself, you wouldn’t just get, you know, tutorials; you would also be reminded that you had searched for that, over and over and over again, on YouTube and Instagram and other places that didn’t really have great restrictions on that kind of content. And this goes back to, you know, early discussions about Tumblr and pro-anorexia blogs.

And so I think now is the point to understand that we’ve reached a kind of critical mass with social media, and dealing with information at scale is just profoundly different from dealing with rumors or hoaxes that stay local. Because social media companies have focused for so many years on increasing scale, increasing information velocity, we now have a bunch of different professionals dealing with it in, like, really slapdash kinds of ways: the way journalists have had to take up the problem of media manipulation because they have been targeted by it, the way election officials have had to deal with it this year. It’s beyond a quick technological fix. We actually need a pretty robust program to deal with the curation problem, so that when you do search for how to vote, you get information particular to your area.

And it’s only recently—like, I cannot express to you enough how recent it is—that these companies have been willing to make those changes. If you had asked me in 2016 whether we were going to get any traction on dealing with these white supremacists that were organizing online, that were rallying at Trump rallies, meeting one another, growing their ranks, expanding their podcast networks, I woulda said no, absolutely not; these companies are completely unable to face what they have built, because they didn’t think about the negative use cases. They didn’t think about how different people, fringe groups, would rise to the top and have an incredibly outsized impact on our culture.

The question of propaganda then comes into full view as Trump kind of…pardon the meme, but assumes his full form as the president who is trying to defend himself from a lost election. This has just kind of thrown everybody in this field back on their heels, trying to map and understand what the real problem is. I was talking with Jane Lytvynenko at BuzzFeed News and she was saying, “I did forty-four debunks in two weeks and I’m tired, and maybe it’s not working anymore. Maybe you can try to debunk everything but it’s just not going to hold. The gates again have broken.”

And so we just…we need to have more thinking, of course, and, like, as a nerd, more research. But I do think the possibilities of solving this problem lie between these different professional sectors. And that’s why I think with the multidisciplinary approach to this problem, where we try to get everybody’s concerns on the table and we try to understand how to navigate that, and then with a little bit of help from groups like Jonathan Zittrain’s Assembly project, we can start to have those deeper conversations that ask, you know, how do we get beyond misinformation at scale, and whose responsibility is it, and how much is it gonna cost? And how do we end up with the Internet that we want rather than this, like…the thing that we have, which currently isn’t working?

And the public health implications right now are really where my mind is at. Because ultimately, this kind of misinformation at scale—politicized medical information around masks, around cures…this is gory stuff. We’re gonna look back on this moment historically, and every one of us is gonna wonder, did I do enough, and how could I have done better? And that’s why I think it behooves us all to try as much as we can to get involved, to think through these problems. And our method is thinking with case studies, because we want to think about things in depth and then extract from them higher-order definitions and principles.

So, that’s where my mind’s at. But I appreciate everybody tuning in. We’re gonna have several misinformation trainings, one every month, starting in January. And we’re going to have lots of opportunities for people to write for the Casebook as well. So I’m really looking forward to that as the next iteration of this project.
