Jenny Toomey: We're bringing up Brendan Nyhan, who is coming to us all the way from Dartmouth College. Brendan in his past ran, with some colleagues of his, a terrific political spin watchdog called Spinsanity between 2001 and 2004, which was syndicated in Salon. And he has a bestseller, All the President's Spin, named back in 2004 one of the ten best political books of the year. Recently he put out a report called "Countering Misinformation: Tips for Journalists," which I suspect may have something to do with what he's about to talk about. Thank you.

Brendan Nyhan: That’s right. Thank you very much. So in my past life I was a fact-checker. So if you remember the commercial from the Hair Club for Men, “I’m not only the president, I’m also a client,” well, I’m not only an academic, I used to do this. So I’ve experienced first-hand the challenges of trying to correct misinformation, and in part my academic research builds on that experience and tries to understand why it was that so much of what we did at Spinsanity antagonized even those people who were interested enough to go to a fact-checking web site. So it was a very select group of people.

And at first it’s a very puzzling phenomenon that we antagonized half of our readers every day. And I know the other fact-checkers in the room will understand that situation. But if you think about motivated reasoning from the perspective Chris has described, it’s precisely those people who are best able to defend their pre-existing views and who have strong views who are most likely to go to a site like that in the first place. And that’s what made it so difficult.

So, I’m very proud of the work that we did at Spinsanity. This was a non-partisan fact-checker that we ran from 2001 to 2004, sort of a precursor to factcheck.org and PolitiFact—more sort of institutionalized fact-checkers. This was two friends and me doing this in our spare time. And we eventually wrote a book and then decided that the model was unsustainable and we shut it down.

A man standing before several people at a meeting table, saying, "On second thought, DON'T correct me if I'm wrong."

But it was a fantastic experience, and what it made me think about was why it’s so difficult to get people to change their minds. And I think Chris has done a great job laying out all the reasons that people don’t want to be told that they’re wrong. And let me just add to that that it’s very threatening to be told that you’re wrong. There’s a cognitive element to this and a political element to this, but there’s also a self-image or self-concept aspect to this that my coauthor Jason Reifler and I have explored in our research. But I just want to underscore how threatening it can be to be told that you’re wrong about something. And that the defensive reactions that threat can generate are part of what makes correcting misperceptions so difficult.

So what I want to do is think about what are different approaches to trying to correct misperceptions. And the obvious place to start, and a place that I have myself called for in my writing, is for the media to be more aggressive in trying to fact-check misperceptions. This is a conference about truthiness in online media, but the mainstream media is still a very potent source of information, very important in the political culture of this country. So what if the media were more aggressive in trying to counter misinformation?

False ‘Death Panel’ Rumor Has Some Familiar Roots, Jim Rutenberg and Jackie Calmes, The New York Times, August 13, 2009

This is an example, the death panel story in 2009. The press was more aggressive in fact-checking that claim than, say, they were in the run-up to the Iraq War. So, is that likely to be effective? And what my coauthor and I did was we actually looked at this experimentally. We took undergraduates and we gave them mock news articles where we could experimentally manipulate whether or not they saw the corrective information. So we have precise control of what they’re seeing and we can measure exactly what their reaction is to it. And the question is what happens when we give them this corrective information. Is this effective at getting them to change their minds about the given factual belief?

And the answer is generally no. So, the pattern across the studies we’ve conducted is that there’s frequently a resistance to corrections on the part of the group that’s most likely to want to disbelieve that correction. This is something we call disconfirmation bias, and it’s very consistent with the story that Chris described to you a moment ago.

This is a claim that was made by John Kerry and John Edwards in 2004. They made statements suggesting that President Bush had banned all stem cell research in this country. That’s not true. He limited federal funding to preexisting stem cell lines. But the language that was used implied that there was a complete ban on stem cell research.

So we exposed subjects to a mock news article about this claim, gave them a correction. Well, who’s likely to want to believe this claim? Liberals who don’t like President Bush. And what you’ll see is that our correction was not effective at reducing their reported levels of misperceptions. Conservatives were quite happy to hear that President Bush hadn’t done this and to go along with it. Liberals on the other hand didn’t move. So the correction isn’t working.

So while that might be troubling enough, it gets worse. In some cases, corrections don’t just fail to change people’s minds, they make the misperceptions worse. And this is what we call the backfire effect. We document two of these in our article, which I’d encourage you to read. Here we’re talking about the claim that President Bush’s tax cuts increase government revenue, a claim that even his own economists disagree with. Virtually no expert support for this claim.

Again, liberals, perfectly happy to go along with a correction of that statement. Conservatives double down, in exactly the way that Chris describes, becoming vastly more likely to say that President Bush’s tax cuts increase revenue rather than less. So in our efforts to combat misinformation, if we’re not careful we can actually make the problem worse. And this is something that everyone in this room should think about when they’re thinking about how to address misinformation. The Hippocratic Oath of misinformation: try not to do harm. Because you can.

Let me just add another point here. There are also people out there who mean well and are not motivated reasoners in the way that Chris and I have discussed. And correcting misperceptions can still make them more misinformed, too. And one mechanism for this is what’s called an illusion of truth effect. So this is from a CDC flier of facts and myths about the flu. This is not something that people have strong preexisting beliefs about in the same sense as their politics, right. Most people are not epidemiologists and experts in the characteristics of the flu. So we tell them some things and we say, “These ones are true and these ones are false and here’s why.”

But when some psychologists showed people this, they divided the ones who saw it: some of them only got the true statements, some got the true and the false. And then they looked at how well they retained this information. What they did was they divided those folks and gave a thirty-minute delay to some people. And after just thirty minutes, a significant percentage of people start to misremember the false statements as true. This is a well-documented phenomenon in psychology where familiar claims start to seem true over time. So again, in trying to correct the misperception, you’ve made the claim more familiar, and that familiarity makes people more likely to misreport these statements as true.

So again, these are folks who have no particular dog in this fight. So again we have to be very cautious about the steps we take. And again let me just underscore that the reason we can tell that these effects are happening is because we’re testing them experimentally. That gives us full control and ability to disentangle exactly what’s going on, which is very very difficult otherwise.

A final note about correcting the problem. The other thing I would just say is while we can talk about fantastic tools we could develop to help correct misinformation, the problem we have is that the folks who seek those tools out may be those whom we’re least interested in reaching, because they may already believe what we want them to believe in any particular case. And even for them, when we do challenge them, counterattitudinal information is only a click away.

This is a snapshot of the results I got when I typed “Obama birth certificate” into Google this morning. Let’s say I want to believe Obama’s birth certificate is real. Well, I can click on the Snopes debunking, but it’s surrounded by a headline that says it could be a forgery, and news about Sheriff Joe Arpaio’s press conference. So, choose your own adventure. Which one do you want? It’s very easy to seek out supporting information for whatever point of view you want to defend. So when we do challenge people, the technology that we have makes it easier and easier to reach out and buttress that view that’s been challenged. So again this is a very difficult challenge. It’s much easier to buttress those views now than ever before.

Now that I’ve depressed you, let me talk about things we can do that are perhaps better approaches. And let me just say that these are part of a New America Foundation report [summary] that my coauthor Jason Reifler and I wrote with the help of several people in this room—it’s available on the table out there. And there are a couple of other reports that are part of that package about the fact-checking movement. But these are a series of recommendations that we’ve developed, based on available research in psychology and political science and communication, on how you can communicate in a more effective manner that’s less likely to reinforce misperceptions.

The first thing. This is what not to do, okay. Remember I told you about that illusion of truth effect, where seeing the false claim and it becoming familiar to you makes you more likely to misremember it as true. This Politico article in its sixth or seventh paragraph eventually gets around to saying “actually, there’s extremely strong evidence that Obama’s a citizen and this is all nonsense.” But if you look at the top of the article, which is what’s excerpted here, what it’s done is it’s restated that claim both in the headline and the lead. And by restating these claims again and making them more and more familiar, we’re actually likely to create this fluency that causes people to misremember these statements as true. So when I say best practices here, this is what not to do. And I have an article on the Columbia Journalism Review blog if you’re interested in reading more about this problem.

Another problem: negations. Again, there are well-intentioned people who are confused sometimes. We’ve often been talking about motivated reasoning and people who don’t want to be convinced. But even those people who are open-minded can have a tough time with negations. When we try to say something is not true, we may end up reinforcing that claim we’re trying to invalidate.

So this is an example of some well-intentioned folks trying to debunk, to discredit, a claim that MMR causes autism. The problem is they have “MMR…cause autism” in the headline. You stare at that long enough and people will start to get nervous. And my coauthors and I have done an experimental study finding similar effects, that trying to correct the MMR/autism association can actually make people less likely to vaccinate rather than more.

A third recommendation. And this is really for the journalists in the room. But the notion of artificial balance, in which each side has to be equally represented in factual debates, has been shown to mislead people quite a bit. So Jon Krosnick at Stanford and his colleagues have a study showing that providing a balanced report, in the sense of one scientist who says global warming will have destructive consequences and one who says it’s great—the planet’s nice and warm (which is this guy’s message)… Providing that balanced point of view dramatically changes how people interpret the scientific evidence. And I don’t want to pick on this particular debate, but just to say that reporting needs to reflect the balance of the evidence. And the he said–she said perspective that Kathleen mentioned earlier, which leads to this sort of quotation of fringe sources to provide balance, can really mislead people. And that’s something to be avoided whenever possible.

Another recommendation. Graphics. People are really good at arguing against textual information. At least in the experiments we’ve conducted, graphics seem to be more effective for those quantities that are… Let me say that a different way. For misperceptions that can be graphed, right. There are lots of misperceptions we can’t graph. I don’t have a graph of Iraq’s weapons of mass destruction. I can’t put one up for you. But what I can do is show you that four different independent sets of temperature data show dramatically increasing global temperatures and are extremely highly correlated. This is from a NASA press release. We found this was quite effective, much more effective than an equivalent textual correction, at changing people’s beliefs about global warming.

Experts Debunk Health Care Reform Bill’s ‘Death Panel’ Rule, Kate Snow, John Gever, Dan Childs, ABC News, August 11, 2009

Another approach, and this is something I don’t think we have talked much about so far. But it’s important to think about credible sources to people who don’t want to be convinced. This is an example of what I thought was an exemplary report on the death panel controversy from ABC News that said— It’s framed as experts, right, doctors agree that death panels aren’t true. So it’s going to medical expertise, it’s getting out of the political domain, and it’s saying that even experts who do not support the version of the healthcare reform bill being proposed by President Obama agree that death panels aren’t in the bill. And it goes on to quote a Republican economist with impeccable credentials on that side of the aisle saying there’s lots to oppose about this bill but death panels isn’t one of them. And to the extent that we can reach out and find those more credible sources to people who aren’t willing to listen to the normal people who are typically offered to them, that may be an especially effective approach.

So for more I’d commend to you the report that I mentioned earlier as well as those by my colleagues about the fact-checking movement. Thanks very much.
