Of course we’re avid, avid watchers of Tucker Carlson. But in a way he’s like the shit filter, in that if things make it as far as Tucker Carlson, then they probably have much more like…stuff that we can look at online. And so sometimes he’ll start talking about something and we don’t really understand where it came from, and then when we go back online we can find that there’s quite a bit of discourse about “wouldn’t it be funny if people believed this about antifa.”
We are immersed in a hyperpartisan media ecosystem where the future of journalism is at stake, the future of social media is at stake. And right now I’m really worried that US democracy might not survive this moment.
Not all dis- and misinformation is foreign, and that’s why this is such a large problem: there are many domestic actors that engage in disinformation campaigns as well. So, the narratives that we’ve seen across the space come from so many different people that sometimes it can be hard to trace the problem to one particular actor or one particular motive.
I want you to know that they are just pathmakers and ‑breakers in their field. There’s a way in which you’re taught to be a scholar and you’re taught to be pragmatic in the choice of your projects, you’re taught to be careful in the ways in which you speak in public, and these two do it better than anyone I know.
In this moment that we’re in today with technology, where I think we’re finally shifting into a mode where it’s possible to be critical without getting sneered at, if we look back at the…I don’t know, the optimistic aspirationalism that we’ve been using to encounter technology in the broadest sense, and we look back on those moments of the recent past or even the distant past, we can see that we actually knew how things were going to turn out. We just weren’t paying them heed.
The key thing that Congress realized…was that if you want platforms to moderate, you need to give them both of those immunities. You can’t just say, “You’re free to moderate, go do it”; you also have to say, “And if you undertake to moderate but you miss something and there’s, you know, defamation still on the platform or whatever, the fact that you tried to moderate won’t be held against you.”
We’re focused on what we call countering foreign influence, but really what we’re trying to do is build national resilience to foreign influence activities. And so for us, a lot of what we do is public education and public awareness outreach to different communities, providing resources that folks can use to better understand both the risk and ways to mitigate that risk.
I think for those of us who study and think about mis- and disinformation, it’s very tempting to study what’s in front of us. And so there’s a disproportionate focus on Twitter, because it’s the easiest to study since there’s an open API—although, caveats—and on Facebook. Those are a lot of the places that we study. And similarly, those are a lot of the places that journalists look for content and sources and stories. And so we end up really just thinking about that as the “problem,” when actually we need to think about the full ecosystem.
The question also does come up, you know, is there anything really new here, with these new technologies? Disinformation is as old as information. Manipulated media is as old as media. Is there something particularly harmful about this new information environment and these new technologies, these hyperrealistic false depictions, that we need to be especially worried about?