Oumou Ly: Welcome to The Breakdown. My name is Oumou. I am a staff fellow on the Berkman Klein Center’s Assembly Disinformation Program. On our program today we’re talking with Claire Wardle, who by way of her Berkman bio is a leading expert on social media, user-generated content, and verification. She is the cofounder and leader of First Draft, which is the world’s foremost nonprofit on research and practice to address mis- and disinformation. Thank you for being with us today Claire, I really appreciate you joining us.
Claire Wardle: It is my pleasure. Thanks for having me.
Ly: Awesome. So our conversation today really centers on the interplay between disinformation and what we call the legacy or professional media ecosystem. Certainly over the past couple of months, since the pandemic has really ramped up and in some ways paralleled and turned into an infodemic of sorts, we’ve seen various points at which it appears that there is a relationship, or even more specifically a pipeline, by which the online information ecosystem seeds disinformation into the mainstream media.
So I have a couple of questions for you about that. And my first one is really about the pathway that false content may follow when it starts maybe in really obscure corners of the Internet, maybe through fringe news sites, and then it eventually makes its way more mainstream. So can you talk about that process?
Wardle: Yeah, so I think for those of us who study and think about mis- and disinformation, it’s very tempting to study what’s in front of us. And so there’s a disproportionate focus on Twitter, because it’s the easiest to study since there’s an open API—although, caveats—and on Facebook. Those are a lot of the places that we study. And similarly, those are a lot of the places that journalists look for content and sources and stories.
And so we end up kind of really just thinking about that as the “problem,” when actually we need to think about the full ecosystem. And it’s not always the case, but there are certainly examples of some of these conspiracy theories, or kind of trending campaigns, or inauthentic activity being coordinated in spaces like, for example, a Discord guild, or it might be 4chan. It might be some of these pretty small spaces. And it would normally be easy to dismiss them and those conversations, because you can see people trying to coordinate, but you don’t necessarily think that’s going to go anywhere. And in a lot of cases it doesn’t. But sometimes you see this basically trading up the chain—this term has been around for a long time—and we talk about this as the Trumpet of Amplification, because you can see it then move into other spaces that might be WhatsApp groups or Twitter DM groups, where the coordination gets a little bit more strategic.
You then maybe see that move into communities maybe on YouTube, or even Reddit, or Gab, or places that are technically public, but these are often communities you’re not spending a lot of time in; they’ve kind of grown up as particular fringe-type communities that journalists are not spending time in. And from there, you see it jump into Instagram, YouTube—the YouTube that you and I spend time in, Instagram or Facebook. And it’s at that point that journalists tend to find it.
And the problem is they don’t necessarily understand that this has come…that there’s a history to this. That there’s potentially been any kind of coordination. And at that level, unfortunately we sometimes see politicians repeating the conspiracies, or influencers repeating the falsehoods. And then at that point, you see the media make a decision that says, “Ugh, we’ve now got to cover it, because it’s being pushed by a particular influencer or politician.” But that was part of the plan. That was the aim, which was to get the media to cover it.
And you know, the other complication here is sometimes the media—even if a politician doesn’t talk about it, there’s a sense of hang on, these rumors, conspiracies, falsehoods, fabricated media…it’s got to a point where actually we have to “debunk” it. And unfortunately, if newsrooms debunk in an irresponsible way, then that itself is the end goal of these kind of “bad actors,” even though that’s a terrible phrase. The fact that newsrooms are reporting on it, they have a megaphone that many of these fringe communities just don’t have. So the role of the professional media in this whole ecosystem is critical, as is the way we understand politicians and influencers. Because you can’t think about mis- and disinformation in 2020 without understanding their roles.
Ly: One difficulty that I hear reporters and journalists talk a lot about is this decision they have to make about whether or not to report it, given that difficulty, right. You want to prebunk, debunk, or maybe even just do simple fact-checking, but you can’t do that without the risk of reamplification. Given the fact that most major print and broadcast audiences are networked, how do reporters go about doing that really critical fact-checking, debunking, or prebunking work without the risk of inadvertently reinforcing the very harms they’re trying to mitigate?
Wardle: So, it’s a great question, because the challenge that newsrooms now face is that they have different platforms that they need to consider. So when we do training with journalists, we talk to them a lot about the work of danah boyd around data voids, which applies to certain conspiracies, for example. If mainstream media don’t do anything to debunk those conspiracies, if somebody hears about one on their family WhatsApp group and they go to Google and they type in, you know, “5G coronavirus,” and there’s no debunking there, then all they get on Google is the conspiracies. So when it comes to information designed for Google and YouTube, newsrooms actually need to like, be creating this content and creating a headline that will get picked up by search.
However, if you’re thinking about a tweet or Facebook post that people are stumbling across, you have to be careful not to give oxygen to a rumor that they might not have heard about, because unfortunately our brains are really bad at making sense of truth and falsity. And even if somebody tells us it’s false, a week later when people go back to [study it?] they’re like “Oh, somebody said something about…Obama being a Muslim, I can’t remember now is he or not,” you know. We’re really really bad at making these distinctions. And so we need to be more careful about ensuring that we don’t tell people rumors that they haven’t heard before.
So it’s really difficult to make sense of all of these things, and there are no hard and fast rules? But all of this is happening within an economy where newsrooms are increasingly struggling. We would be naïve not to recognize that some of these headlines, some of these debunks, actually get a lot of traffic. So, a story I think of, from 2012: there’s a now very famous YouTube video of an eagle stealing a baby in a park. And it’s pretty like, “Oh my goodness!” And then it transpired that it was actually students at a university in Canada—they’d been given an assignment to create a video that would fool journalists.
And I was working with a newsroom at the time that ran the video, and I was like, “Oh my goodness, are you mortified that you ran this?”
And without missing a beat the digital editor was like, “Well no, the debunk will get twice the traffic.” So like, there was a recognition that you know, we have to be wary, be aware of that when we have these discussions.
Ly: Right, is there an understanding within the industry that this is a problem and there’s a wholesale shift in thinking that needs to happen?
Wardle: I would argue in the last two years there have been many more discussions in newsrooms about the role that they are playing in the information ecosystem now that we have real challenges with information pollution. And some of the work by Whitney Phillips or Joan Donovan has really I think forced some internal conversations about this. And so what you see is newsrooms now saying, “Well, is it right that we used that particular keyword, because now we’ve learned that if we use the keyword, that actually will then send people to conspiracy theories online. Maybe that’s not sensible.”
So it is far from where we need it to be, but we’re asking for a pretty big paradigm shift. You know, within the news industry there has always been this idea that more sunlight is a disinfectant. That by “holding people to account,” by reporting on problems, that actually—then that will help. And the challenge here is, these bad actors that we started talking about at the beginning of the interview that potentially are sitting on 4chan or Discord or in WhatsApp groups coordinating, they really…that’s their whole end goal, that they will get that coverage.
But it’s very difficult— And in trainings, when you show journalists, like, “Listen, this is a discussion that they are having about you as journalists and how they can manipulate you into covering them,” it’s only then that you kind of have this lightbulb moment from the journalists, who say, “Well that’s not why I went into journalism. I did not go into journalism to help these people get more coverage.” And I think when you have a discussion about how every newsroom thinks very strategically about how to cover a press release, how to cover suicide, how to cover troop movements during a war, I think then there’s a sense of, “Oh okay, this isn’t anything new. We just have to be aware of the unintended consequences of our coverage, which previously we didn’t have to think about, because we weren’t covering disinformation.”
Ly: Yeah. That really seems to relate to what was going to be my next question anyway, about the structural issues that give rise to this dynamic. In what ways do you think that really intrinsic link between large and legacy professional media organizations and corporate advertising plays into the way this plays out?
Wardle: Well, I mean we’ve seen some great journalism over the last four years in particular, kind of really taking to task the platforms and thinking about the way that the business model of the platforms drives disinformation. But I don’t think we’ve had the same conversation about how that also plays out in the news industry. I think again, we’re being naïve if we don’t recognize that the selection of certain stories, the framing of certain stories within the news business is designed for clicks and traffic and ad revenue. And because of that you know, I think there have been stories written that have unfortunately done more harm. And again, it’s very difficult in the moment to have these conversations, and I would say that over the last couple of years I see much more reflection from newsrooms about these unintended consequences. But again, sometimes the people who are thinking about this aren’t necessarily the people who write the headlines, or ultimately decide what to cover, so that’s the challenge.
Ly: Yeah. Okay, I want to shift gears for a second and talk about politics, because political reporting is really ripe for disinformation in a way that’s unique to it. I mean it’s one of those kind of interesting sectors where the material harm of disinformation is pretty immediately recognizable, because of the way so much of politics plays out in the public sphere. And I know that certainly over the last ten years there’s been a growing conversation about the relationship…maybe not the relationship but better yet the tension between balance and objectivity in political reporting. And the goal, when you listen, really is to indemnify news organizations against any claims of bias by either political party. So can you talk a little bit about that debate and how you see it playing out, maybe in the context of the COVID infodemic?
Wardle: I mean the COVID situation’s so interesting, because we’ve all sat around and been like wow, this is health misinformation. This just goes to show that the platforms could be doing much more than they’ve already been doing. But already we’re seeing how COVID is now becoming interlocked with political conversation. So for example, monitoring the recent protests over excessive quarantine, and seeing the online conversations and understanding that there are a lot of anti-vax groups pushing that, there are lots of Second Amendment gun rights people pushing that. It gets very…complex when you start adding…
Ly: Very interesting. Yeah.
Wardle: And so this idea that there’s health misinformation or there’s political misinformation or disinformation, these boundaries are actually very difficult. But the challenge of reporting all of this is, you know, as the Network Propaganda book showed us last year, this is asymmetrical. And so because of that it’s very difficult to sometimes tell these stories, make sense of this landscape, for journalists who have been absolutely trying, within an inch of their life, to always take both sides? And in the same way, there is the truth that most newsrooms do tend to have people who sit more on the left wing. And so they’re already trying to counter what they perceive as potential “bi—” well, they wouldn’t see it as bias, but everybody’s aware of how that might play out. So it’s a really problematic space for everybody, because people are trying to use their training to do journalism, but as I just said, the challenges that journalists now face are ones they weren’t taught about in journalism school. They weren’t taught how do you cover disinformation? Because journalism is about covering the truth, it’s not about covering the falsehoods.
So now, there’s this situation that journalists find themselves in that’s really hard. And of course the trust question: how do you report on these spaces, knowing that it’s important that certain communities receive quality information, yet knowing that those communities are much less likely to trust the professional media? I think the fear I have is that I see an increasingly polarized country politically, but also, when we look at consumption of professional media, I see half the country going nowhere near the professional media. And those people are actually more likely to be recipients of misinformation. It’s a real problem. And I don’t see an easy way out of this.
Ly: Thank you so much for joining us today Claire. I really appreciate it. We had a great conversation.
Wardle: Thank you very much.