Oumou Ly: Welcome to The Breakdown. My name is Oumou, I’m a staff fellow on the Berkman Klein Center’s Assembly Disinformation Program. Our topic for discussion today is the upcoming November 2020 election. The 2016 election in so many ways laid bare the connection between countering disinformation and protecting democratic processes, and I think that it’s been difficult since the 2016 election to quantify the causal impact of disinformation on public confidence in democratic institutions. We know that foreign actors worked to systematically influence the election and we know that they’re working to do the same this fall. My guest today is Brian Scully from the Department of Homeland Security, who will introduce himself.
Brian Scully: Sure. Currently I run the DHS Countering Foreign Influence Task Force, which is in the Cybersecurity and Infrastructure Security Agency. And I’ve been doing that for about two years now.
Ly: Awesome. So, the election, with all that is going on in the country right now, seems to be a little bit of a third thought, behind COVID and ongoing protest activity related to police executions. But I want to start talking about the election during our Breakdown series because we know it’s one of those areas that is just so ripe for disinformation, particularly the kind that we’re concerned about in the Assembly program, which is state-backed coordinated disinformation operations. Can you talk a little bit more in depth about what you do at DHS in CISA?
Scully: Sure. So like I said I lead a small team. We’re focused on what we call countering foreign influence but really what we’re trying to do is build national resilience to foreign influence activities. And so for us a lot of what we do is public education and public awareness outreach to different communities, provide resources that folks can use to better understand both the risk and then ways to mitigate the risk. For us in particular we’re trying to reduce the amount that Americans engage with disinformation. And so a lot of our products and information are focused on that.
We’re also working…specific to the election, we work a lot with state and local election officials to help them with content around the security of the election, right. So helping citizens understand how to vote, where to vote…you know, how absentee ballots work, things like that. We do that in support of state and local election officials.
And so our two main areas are really trying to help folks understand what’s going on around the 2020 election, and then building broader kinda long-term resilience to disinformation amongst the American people.
Ly: Awesome. So, one of the first substantive questions I have for you in terms of the contrast between what happened in 2016 and what we expect to observe this fall is maybe just generally what have we learned about state-backed information operations since 2016? What were the big takeaways you’d say, from your perspective?
Scully: Sure. So we’ve actually learned a lot. And I think you know, a lot of it can be—a lot of what we draw from is one, the Mueller report and the indictments. There’s a lot of detail and information about how the Russians acted and behaved and what they did in 2016. The Senate Intel Committee reports have been a fantastic resource. And since 2016 the research community in particular has been extraordinary in terms of really diving deeply into what happened in 2016 and what we’re seeing since then.
So I think a few of the important things we’ve learned since 2016 from those reports is one, we got a better sense of both the sophistication and the reach of Russian efforts in 2016, right. So, it was really interesting for me to read about how the Russians had agents on the ground in the United States trying to better understand our political environment, for example. It was really interesting to learn and understand how they tried to set up protests, right, where they would have both sides protesting against each other, again to take things so it wasn’t just on social media but to bring things into the real world, and to have people really acting and behaving a certain way, and trying to stir up conflict again and again. So those tactics were super interesting.
The other thing that was really interesting is really seeing how they were running it like a marketing or advertising campaign, right. They were A/B testing messaging. They were using data analytics to understand what narratives and messages were most effective. And then they were tailoring what they were doing based on that. And so those are kinda the two big things from a tactical standpoint and then from a marketing, advertising, communications standpoint were super fascinating.
For me personally, the area that I’ve learned the most is really the psychology behind disinformation, right. Why disinformation works. And starting to see a lot of the research coming out about the psychology of disinformation; why disinformation works on humans; and Americans, why people kinda jump into it. That’s really…you know, we talk a lot about tech solutions to disinformation. For me, tech solutions are helpful, but the real problem is a human problem so how do we better understand the human aspects of disinformation, and then how can we work to mitigate those sorts of things.
Ly: One of the other questions I had was related to what you mentioned earlier about protest activity—how IRA agents staged protests and counterprotests on the same issue in the US. And we’re in such an interesting world in 2020 where they don’t have to do any of that work at all. Those protests are already going on. Are you seeing, or is DHS seeing, any attempts by state actors to use the ongoing protest activity here for their own geopolitical aims?
Scully: So it’s been interesting about the current protests…the short answer is yes. But what’s been interesting about it is it’s been mostly overt, open communications, right. It’s not…I’m sure there’s some covert activities going on where they have their false accounts and things like that. But if you just look at state-run media, if you just look at diplomatic communications, right, where the embassies are tweeting things out or posting messages, it’s very overt communications where they’re trying to take advantage of the discord in the United States. You know, again, the goal of nation-state actors is to reduce the strength of the United States so that they have a better chance to achieve their strategic goals. And so anything they can do to kinda undermine the legitimacy of the United States, undermine the legitimacy of democracy, they’re going to take advantage of that, and this is obviously a perfect opportunity to do that.
And you know, it’s a legitimate…right, it’s a real issue here in the United States that we have to deal with and those are the most effective ways to create fissures and divisions. It’s just, you know, it’s just reality, unfortunately.
Ly: Yeah. Have you seen any changes in how state actors are maybe seeding or planting disinformation online?
Scully: Sure, absolutely we’ve seen some changes. You know, as the platforms and as governments have become more aggressive in terms of dealing with automated accounts, and bots, and inauthentic activity, bad actors of course continue to change their tactics and their tools. So we’ve seen a few things I think that are important.
One, we’ve seen new actors, both state and non-state actors, really take on the Russian playbook from 2016. So the first thing that we’re seeing is just a lot more players in the field, right. It’s not just the Russians who’re pushing it, it’s a range of state actors. And then more importantly I think, and more challenging certainly from a government standpoint, is we have a lot of domestic actors that are taking the playbook from 2016 and leveraging it for their own purposes now. So that’s I think first and most important.
From a more tactical level we’ve also seen a few different changes. So, state actors now are much more focused on amplifying—as we were just talking about—amplifying existing narratives that are—you know, exist in the United States, right. So in 2016 there was more where the state actors were creating narratives and then trying to amplify the narratives that they’d created. Now what we’re seeing more of is where they’re just jumping on and getting behind narratives that are being pushed by American citizens, right. So they don’t have to create the content themselves, they just jump onto existing narratives and disinformation that’s being pushed.
So that’s one. Two, we’ve seen more leveraging of proxies, right. And so back a month or two ago, there’s a great story that CNN did, using some really excellent research from a variety of different researchers, where they actually sent a reporter to Ghana, where Russians were leveraging a Ghanaian social media marketing company to do disinformation in the United States. And so we’re seeing a lot more of that, where they’re trying to identify local proxies because it makes it much more difficult for the platforms in particular to identify inauthentic activity, right. And so we’re seeing more of that. So that’s super fascinating—I don’t think as a government official I’m allowed to promote a particular news agency, but if you get a chance, that story and that research are really fascinating to see how it’s actually being done on the ground and how it actually works.
Another thing we’re seeing a lot of is what we call disinformation as a service. This is where companies are essentially offering disinformation services, right. You can go hire somebody to run disinformation on your behalf. And so again it just opens up the playing field for the number of actors who can kinda get involved and be involved in disinformation activities.
And then of course you’re just seeing kind of ramping up of old becomes new, so forgeries, things like that. We’ve seen some where rather than trying to subvert reporters they’re creating fake reporters and fake news sites to push articles and things like that. And then you see a lot of leveraging of overt media activities, right, and connecting state-run media with narratives and using the state-run media to amplify and things like that as well.
So there’s a lot going on. I think we’ll see a lot of what we’ve seen in the past, but then you’ll see some of these new items and some tweaks and things like that to try to make it more effective.
Ly: Yeah. I know one challenge that came out of the Mueller investigation was that there’s a pretty clear avenue for prosecuting state actors or non-US persons who participate in these operations, but not so much for US persons, American citizens. Can you talk a little bit about how the government is responding to that difficulty since 2016?
Scully: Yeah. So…I mean I think that’s an important question, right. Foreign actors can break the law, and they can be investigated and hopefully prosecuted. Obviously prosecution can be difficult. For US citizens, there are First Amendment protections for speech that shape how we deal with that. So it’s a difficult topic. I mean there’s a lot of incentives for folks to push and conduct disinformation operations for their personal gain or political gain within the United States. And that creates a challenge, right. We can’t investigate First Amendment-protected speech, you know. The FBI and DOJ can’t do anything about that. And DHS is certainly not getting involved in trying to do anything around domestic speech.
So that is the value… You know, that is essentially bad actors taking advantage of our freedoms, right. American citizens, we can post up our opinions however we want, whenever we want, and foreign actors can come in and find the worst of those opinions, or the most divisive of those opinions, and amplify them. And so it is definitely a particular challenge when domestic actors are participating and active in disinformation. And I don’t know that there’s a good answer to that problem. Certainly not yet.
Ly: Yeah. What do you see as the government’s overall role in combating disinformation?
Scully: That is the $64,000 question, right? Yeah that’s a good question. And I think— You know the first thing, and this is something I grapple with regularly. It was something when I was up in the fellowship, that I talked to my fellow…my fellow fellows about, quite a bit. So I think it’s a really difficult question. And it’s not just a difficult question about what’s the government’s role, I think we have to be talking about what the platforms’ role is, what tech companies’ role is, what civil society’s role is, and what the individual’s role is in dealing with and combating disinformation.
So from the government side, I think what we’re doing now is essentially leveraging the authorities and what we can legally do, right. So, we have the intelligence community helping to understand the threat, identifying bad actors, doing the things that the intelligence community does. And does very well, right. And then we have our law enforcement agencies, the FBI, the DOJ, right. They’re investigating when the law is violated, when the law is broken, they’re investigating, indicting, prosecuting the bad actors. And that’s great, and that’s a piece of it.
And then we have to build—what DHS is doing, we’re trying to build public resilience, right. We’re trying to help the American people understand the role that they play and the steps that they can take.
And then of course at a broader level, particularly for state actors, we need to establish deterrence measures. And I believe the State Department and the National Security Council, and the interagency more broadly, are kind of working on how we can deter state actors from trying to interfere in what we’re doing.
And so I think those are the things that government can do now. And that’s what we’re trying to do now, within the authorities we have. The question is, you know, we have people calling for more monitoring of speech on platforms, right—calling for government to tell the platforms this crosses a line, they need to take it down, or asking the platforms to do that themselves. And so you know, that gets into protected speech and First Amendment rights, and I think those are super difficult, challenging questions we have to deal with.
And there are things as government we deal with every day, not only the free speech issues and First Amendment protections, but privacy issues and those sorts of things. As you know, attributing an actor on social media is difficult, right. I mean, anonymity is kinda the thing on social media for a lot of these platforms. And so it’s very difficult to identify and attribute who’s saying something. And so if you don’t know who’s saying something—it could be an American citizen—how do you deal with that? If you know it’s a foreign actor, it’s a little different approach, right. From a government standpoint it opens up some different avenues for us. But if you don’t know, or if it’s potentially an American citizen, how do you deal with that differently?
So, my view is I think we need a broader national discussion on roles and responsibilities. You know, what’s the government’s role, what’s the platforms’ role, what’s the tech companies’ role, what’s the individual citizen’s role, how can we leverage civil society in this space? I don’t know that we’ve gotten there yet. But I think that conversation needs to occur. And so the government right now, I think we’re leveraging our authorities. I think certainly from a DHS standpoint we try to push as much as we can, but we’re also super cautious. So I don’t know how you can be aggressively cautious, but I think that’s generally our approach. Because we’re very, very concerned about privacy and First Amendment protections and things like that. And I know that our interagency partners feel the same way as well.