Chris Mooney: It’s great to be with you to discuss the science of why we deny science and reality. In other words, what is it about our brains that makes facts so challenging, so odd and threatening? Why do we sometimes double down on false beliefs? And maybe why do some of us do it more than others? That’s the next book. I’m not going to talk about it that much. I’m going to try to be somewhat down the middle here, although it’s difficult.

But I’ve been writing about political and scientific misinformation for a decade, and I’m going to confess: I got a lot of the big picture wrong initially. The first book was called The Republican War on Science, and we did not notice at the time the echo visually with another book that was out. And it was all about people denying the science of global warming. You know, denying evolution. Getting it wrong on stem cells.

What was wrong about my analysis was that I was wedded to an old Enlightenment view of rationality. And what I mean is that I had this vision: you know, if you put good information out there and you use rational arguments, people will come to accept what’s true. Especially if you educate them, in places like this. Teach them critical thinking skills. And a lot of people believe, or want to believe, that this is true.

The problem is that, rather awkwardly, there is a science of why we deny science. There are facts about why we deny facts. There’s a science of truthiness. I was actually going to title the book that, but The Republican Brain is better. But there’s a science of truthiness. And the upshot is that paradoxically, ironically, the Enlightenment view doesn’t describe people’s thinking processes. So if we’re actually enlightened, we have to reject the Enlightenment view. And I want to tell you first how I realized that.

The key moment came in the year 2008, when I stumbled upon something that I call the “Smart Idiot” Effect. What is a smart idiot? This was data from Pew, and it was a poll on global warming. And it was showing the relationship between political party affiliation, level of education, and belief that global warming is caused by human beings.

And I don’t know if you can see it very well because of the color contrast, but what it shows is that if you’re a Republican, the higher your level of education, the less likely you are to believe in scientific reality. So these are the college grads, these are the non-college grads, and you’ve got less belief among the college grad Republicans in what’s true than you do among the non-college grad Republicans. Whereas for Democrats and independents, the relationship between education and believing in reality is the opposite—they believe it more. So these people, the 81% that don’t believe it, those are smart idiots.

And I’m trying not to be partisan, so I will show you liberal smart idiots. You could call this dumb and dumber.

Now, the people who deny the science of vaccines, which is that they do not cause autism: it turns out that the New England Journal of Medicine studied who these people are, and they tend to be white, well-to-do, and the mother has a college education. And they tend to go online, to what Jenny McCarthy called “the university of Google,” and inform themselves about this. And so by empowering themselves they end up more wrong, and end up believing that vaccines are dangerous.

Education and intelligence therefore do not guarantee sound rational decisions, nor do they ensure that people accept science or facts. And I want to give you an even crazier, more fun example of the smart idiot effect.

So this is from the political scientist John Sides of George Washington. He unpacked the data on why belief that President Obama is a Muslim increased—or in whom it increased—between 2009 and 2010. So here’s where the increase is, and again we’ve got a higher slope for Republicans with some college or a college degree than for those who have less education. So again, the smart idiot effect. And this is pretty frustrating. You’re like, why do they do this? How could this possibly be?

It seems that the more capable you are of coming up with arguments that support your beliefs, the more you’ll think you’re right. And the less likely you will be to change. And if this is true, we have a pretty big problem. It would explain a lot of the polarization in America.

The good news is that there is a way of understanding why we do this, and it’s quite relatable. The science behind it is pretty easy to understand, because we all know from our personal lives, we all know from our relationships, how hard it can be to get someone we care about to change their mind. And we know it from great works of literature like Great Expectations. I’ve actually seen the black and white Great Expectations.

You know, there are so many famous characters who fall for self-delusion, who fall for rationalization, Pip being one of them. He wants to believe that he’s destined to marry Estella. He wants to believe that Miss Havisham is not this manipulative old crone, that she actually has his best interests at heart. He believes this so strongly that he essentially ruins his life. That’s the story of Great Expectations.

What Dickens grasped about people in a literary way, we are now confirming based on psychology and even neuroscience. And what this leads to is a theory called “motivated reasoning.” And what it demonstrates is that we don’t think very differently about politics than we do about emotional matters in our personal lives, at least if we have a strong investment or commitment.

To understand how motivated reasoning works, you need to understand three core points about how the brain works. The first of them is that the vast majority of what all of our brains are doing, we are not aware of. The conscious part, the us, the self—just a small percentage of what it’s up to.

The second one is that among the things it’s up to that we’re not aware of, the brain is having rapid-fire emotional or “affective” responses to ideas, images, stimuli, and it’s doing it really fast, okay. So fast that we don’t even know it’s doing this.

And then the third point is that these emotional responses guide the process of retrieving memories into consciousness—thoughts, associations that we have with the stimuli. Emotion is the way that those things are pulled up for us to think about.

And so what this means is that we might actually think we’re reasoning logically. We might think we’re acting like scientists. But in fact we’re acting like these guys. And I love being at Harvard Law School putting up lawyers. When we actually become conscious of reasoning, we’ve already been sent down a path by emotion, retrieving from memory the arguments that we’ve always made before that support what we already think. So we’re not really reasoning, we’re rationalizing. We are defending our case. And our case is not just a part of who we are, it’s a physical part of our brains.

And this explains so many things about why reasoning goes awry. It explains goalpost shifting, for instance. So you know, for years the birthers wanted the birth certificate. The long-form birth certificate. Last year they got it. So did they stop being birthers? No. Did they change their minds? No. The new science of why we deny science can predict that they would double down on their wrong belief and come up with new reasons to distrust the new evidence that they’d been given.

And the same goes for all manner of logical errors, fallacies, hypocrisies. The answer used to be, hey, just learn critical thinking and you’ll avoid these. And that might be true to an extent, but it seems like reasoning is designed to help us see these problems in others way better than to see them in ourselves. And it’s designed this way. It manifests first when we’re quite young.

I want to tell you about a really funny motivated reasoning study involving the biases of adolescence. You might remember that in the 1980s there was this big battle about labeling rock albums. And there was this fear that music corrupted kids, led them to Satanism, what have you. And one side in the debate was the PMRC and Tipper Gore. And here she’s got an album, the title is Be My Slave. I don’t know if you can see the whips and chains. But this was dominatrix metal, so. Some moms didn’t think little Johnny should get this record. And on the other side there was Frank Zappa, here pictured in the “Phi Zappa Krappa” picture—that’s so awesome. And as he testified before Congress, I love this quote, “The PMRC’s demands are the equivalent of treating dandruff by decapitation.” Such was the debate, okay.

So in comes a psychologist named Paul Klaczynski, and he does a motivated reasoning study dovetailing with it all. He took ninth and twelfth graders who were either fans of country music or fans of heavy metal, and he asked them to read arguments that he had constructed. All of the arguments contained a logical fallacy. All the arguments were about the effects of listening to a particular kind of music on your behavior. So here’s an example of a fallacious argument they might have gotten, something like this: Little Johnny listened to Ozzy Osbourne, then he killed himself. Heavy metal is dangerous stuff. Okay, everyone see the fallacy there?

And so you would get the same kind of flawed arguments about country music. So what happens? Adolescent country fans can see logical fallacies when they’re in an argument suggesting that listening to country leads to bad outcomes, bad behaviors. But they do not detect the fallacies in pro-country arguments. The same thing for heavy metal fans. They’re all biased, and it doesn’t get better between ninth and twelfth grade. More education doesn’t somehow let you see what’s wrong with your point of view.

And it is not just problems or fallacies of basic reasoning that motivated reasoning explains; it accounts for all the science debates that I write about. The “my expert’s better than your expert” problem gets straight to the heart of why some people deny science. They think their science is better.

So this is from Yale’s Dan Kahan. He’s also here at the Harvard Law School sometimes, I think. And he divides people’s moral values up along two axes. What’s important about this is your moral values push you emotionally to believe things, and so you start tagging things emotionally based upon what your values are. So we’re either hierarchical or egalitarian, that’s up and down. We’re either individualistic or communitarian.

I think people basically know what these words mean. But in case you don’t, Republicans live up here, Democrats live down here. Republicans tend to be hierarchical and individualistic, supporting more inequality and supporting keeping government out of life. And Democrats are more communitarian, supporting the group—it takes a village—and egalitarian, supporting equality.

Okay, the point is that this determines who you think is a scientific expert. Because then he shows people a fake “scientist” who either supports or doesn’t support the consensus that global warming is caused by humans. And if you’re up in the quadrant where the Republicans live, then only 23% agree that the scientist is a trustworthy and knowledgeable expert if the scientist says that global warming is caused by humans. But down here, 88% agree that the scientist is an expert.

So in other words, science, for many of us, is whatever we want it to be, and we’ve been impelled automatically, emotionally, into having that kind of reaction to who an expert actually is.

This is bad enough, but the double whammy always comes when you combine these flawed reasoning processes with selective information channels, of the sort that the Internet makes so possible. But this is only a difference of degree, not a difference of kind, because we already see what happens with the selective information channel that is Fox News.

So, lots of research showing that Fox News viewers believe more wrong things. They believe wrong things about global warming, wrong things about healthcare, wrong things about Iraq, and on and on and on. And we can document that they believe them more than people who watch other channels.

Why does this occur? Well, they’re consuming the information, it resonates with their values, and then they’re thinking it through and arguing it, getting fired up—motivated reasoning—and then they run out and they reinforce the beliefs. But also, and this is where I’ll get into some possible left/right differences: there’s some evidence to suggest that conservatives might have more of a tendency to select into belief channels that support what they believe to begin with. Right-wing authoritarians are a much-studied group, and they are part of the conservative base, and there are some studies showing that they engage in more selective exposure, trying to find information that supports their beliefs.

There was a study by Shanto Iyengar at Stanford of Republicans and Democrats consuming media. And what he found was that Democrats definitely didn’t like Fox, but they spread their interests across a variety of other sources. But Republicans were almost all Fox, and as they wrote, “The probability that a Republican would select a CNN or NPR report was about 10%” in this study. So the Fox effect, believing wrong things, is probably both motivated reasoning and also people selecting into the information stream to begin with.

So how do you short-circuit motivated reasoning? That’s what Brendan Nyhan is going to talk about. I’m not going to talk about it, but I will say you don’t argue the facts. There are approaches that are shown by evidence to work. For my part, let me just end with some words of eternal wisdom from a truly profound philosopher, Yoda. Yoda said you must unlearn what you have learned. And when it comes to the denial of science and facts, that is precisely the predicament we’re in. So thanks.

Further Reference

Inside the Political Brain, Chris Mooney on the Klaczynski study

Truthiness in Digital Media event site
