Tali Sharot: By the end of today, 4 million blogs will be posted, 80 million Instagram photos uploaded, and 600 million tweets released into cyberspace. That’s more than 7,000 tweets per second.

Why do you spend precious moments every day sharing information? There are probably many reasons, but it appears that the opportunity to impart your knowledge to others is internally rewarding. A study conducted at Harvard showed that people were willing to forgo payment in order to have their opinions broadcast to others. Now, we’re not talking well-crafted insights here. These were people’s opinions about whether Barack Obama enjoys winter sports, or whether coffee is better than tea.

A brain imaging study showed that when people had an opportunity to share these pearls of wisdom with others, the reward center in their brain was strongly activated. You feel a burst of pleasure when you share your thoughts, and that drives you to communicate. It’s a nifty feature of our brain, because it ensures that ideas are not buried with the person who first had them, and as a society we can benefit from the minds of many.

But for that to happen, sharing is not enough. We need to cause a reaction in others. What, then, determines whether you affect the way people behave and think, or whether you’re ignored?

So, as a scientist I used to think that the answer was data. Get data, couple it with logical thinking, and that’s bound to change minds, right? So I went out to try and get said data. My colleagues and I conducted dozens of experiments to figure out what causes people to change their decisions, update their beliefs, and rewrite their memories. We peeked into people’s brains, we recorded bodily responses, and we observed behavior.

So you can imagine my dismay when all of these experiments pointed to the fact that people are not, in fact, driven by facts. People do adore data. But facts and figures often fail to change beliefs and behavior. The problem with an approach that prioritizes information is that it ignores what makes us human: our desires, our fears, our emotions, our prior beliefs, our hopes.

Let me give you an example: climate change. My colleagues Cass Sunstein, Sebastian Bobadilla-Suarez, Stephanie Lazzaro, and I wanted to know whether we could change the way people think about climate change with science. So first of all we asked all the volunteers: did they believe in man-made climate change? Did they support the Paris Agreement? Based on their answers, we divided them into strong believers and weak believers. We then told everyone that experts estimated that by 2100 the temperature would rise by six degrees, and asked them to give us their own estimate. The weak believers gave an estimate that was lower than that of the strong believers.

Then came the real test. We told half of the participants that the experts had reassessed their data and now concluded that things were much, much better than previously thought, and that the temperature would rise by only one to five degrees. We told the other half of the participants that the experts had reassessed their data and now concluded that things were much, much worse than previously thought, and that the temperature would rise by seven to eleven degrees. Then we asked everyone again for their own estimate.

The question was, would people use this information to change their beliefs? Indeed they did. But mostly when the information fit their preconceived notions. So when the weak believers heard that the experts were saying that things were actually not as bad as previously thought, they were quick to change their estimate in that direction. But they didn’t budge when they learned that the experts were saying that things were actually much worse than previously predicted.

The strong believers showed the opposite pattern. When they heard that the experts were saying that things were much more dire, they changed their estimate in that direction. But they didn’t move that much when they learned that the experts were saying that things were not that bad.

When you give people information, they are quick to adopt data that confirms their preconceived notions, but they often look at counterevidence with a critical eye. This causes polarization, which expands and expands as people get more and more information.
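As an aside (my illustration, not part of the talk): you can see how this asymmetry produces polarization with a tiny simulation. In the sketch below, two agents receive the same mixed stream of optimistic and pessimistic reports, but each discounts reports that fall far from its current estimate. All numbers and the exponential discounting rule are illustrative assumptions, not the model used in the study.

    # Toy sketch: two agents see the SAME evidence stream, but each
    # down-weights reports that lie far from its current belief.
    import math
    import random

    def update(belief, evidence, rate=0.2, tolerance=2.0):
        """Nudge belief toward evidence, heavily discounting distant reports."""
        weight = rate * math.exp(-abs(evidence - belief) / tolerance)
        return belief + weight * (evidence - belief)

    random.seed(1)
    weak, strong = 5.5, 6.5  # slightly different starting estimates (degrees)
    print(f"before: weak={weak:.1f}, strong={strong:.1f}, gap={strong - weak:.1f}")

    for _ in range(300):
        report = random.choice([3.0, 9.0])  # optimistic vs. pessimistic reports
        weak = update(weak, report)   # both agents receive the same report
        strong = update(strong, report)

    print(f"after:  weak={weak:.1f}, strong={strong:.1f}, gap={strong - weak:.1f}")

Despite identical inputs, the two estimates drift toward opposite ends of the scale, and the gap between them grows as more reports arrive.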

What goes on inside our brains when we encounter disconfirming opinions? Andreas Kappes, Read Montague, and I invited volunteers into the lab in pairs, and we simultaneously scanned their brains in two MRI machines while they were making decisions about real estate and communicating those assessments to one another.

What we found was that when the pair agreed about a real-estate assessment, each person’s brain closely tracked the opinion of the other, and everyone became more confident. When the pair disagreed, the other person was simply ignored, and the brain failed to encode the nuances of their evaluation. In other words, opinions are taken to heart and closely encoded by the brain mostly when they fit our own.

Is that true for all brains? Well, if you see yourself as highly analytical, brace yourself. People who have better quantitative skills seem to be more likely to twist data at will. In one study, 1,000 volunteers were given two data sets: one looking at a skin treatment, the other at gun control laws. They were asked to look at the data and conclude: Is the skin treatment reducing skin rashes? Are the gun laws reducing crime?

What they found was that people with better math skills did a better job at analyzing the skin treatment data than people with worse math skills. No surprise there. However, here’s the interesting part. The people with better math skills? They did worse at analyzing the gun control data. It seems that people were using their intelligence not necessarily to reach more accurate conclusions, but rather to find fault with data they were unhappy with.

The question then becomes: why have we evolved a brain that is happy to disregard perfectly good information when it doesn’t fit our own beliefs? This seems like potentially bad engineering, leaving errors in judgment. So why hasn’t this glitch been corrected over the course of evolution?

Well, the brain assesses a piece of data in light of the information it already stores, because on average that is in fact the correct approach. For example, if I were to tell you that I saw pink elephants flying in the sky, you would conclude that I’m delusional or lying, as you should. When a piece of data doesn’t fit a belief that we hold strongly, that piece of data, on average, is in fact wrong. However, if I were to tell my young daughter that I saw pink elephants flying in the sky, most likely she would believe me, because she has yet to form strong beliefs about the world.

There are four factors that determine whether a piece of evidence will alter your belief: your current belief; your confidence in that current belief; the new piece of evidence; and your confidence in that piece of evidence. And the further away the piece of evidence is from your current belief, the less likely it is to change it. This is not an irrational way to change beliefs, but it does mean that strongly held false beliefs are very hard to change.
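As a hedged aside (my formalization, not language from the talk): this is essentially Bayesian belief updating. If, purely for illustration, we summarize the current belief as an estimate \mu_b held with precision \tau_b (confidence expressed as \tau = 1/\sigma^2), and the new evidence as a value x_e reported with precision \tau_e, the updated belief is the precision-weighted average

    \mu_{\text{new}} = \frac{\tau_b \mu_b + \tau_e x_e}{\tau_b + \tau_e}

All four factors appear explicitly: the more confident you are in the current belief (larger \tau_b), the less the evidence moves you; the more credible the evidence (larger \tau_e), the more it does. The talk’s further point, that distant evidence moves beliefs less, goes beyond this simple Gaussian formula; one way to capture it is to let \tau_e itself shrink as |x_e - \mu_b| grows, i.e., to treat implausible reports as less reliable.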

There is one exception, though: when the counterevidence is exactly what you want to hear. For example, when people are told that others see them as much more attractive than they see themselves, they’re happy to change their self-perception. Or if you learn that your genes suggest that you’re much more resistant to disease than you thought, you’re quick to change your beliefs.

What about politics? Back in August, 900 American citizens were asked to predict the results of the presidential election by putting a little arrow on a scale that went from Clinton to Trump. So if you thought Clinton was highly likely to win, you put the arrow right next to Clinton. If you thought it was 50/50, you put it in the middle. And so on and so forth. They were also asked, “Who do you want to win?”

So, half of the volunteers wanted Trump to win, and half wanted Clinton to win. But back in August, the majority of both the Trump supporters and the Clinton supporters believed that Clinton was going to win.

Then a new poll was introduced, predicting a Trump victory, and everyone was asked again, “Who do you think is going to win?” Did the new poll change their predictions? Indeed it did. But mostly it changed the predictions of the Trump supporters. They were elated to hear that the new poll suggested a Trump victory and were quick to change their predictions. The Clinton supporters didn’t change their predictions as much, and many of them ignored the new poll altogether.

The question then is: how do we change beliefs? I mean, surely opinions do not remain stable; they do evolve. So what can we do to facilitate change? The secret is to go along with how our brain works, not against it.

So, the brain tries to assess any piece of evidence in light of the knowledge it already stores. And when that piece of evidence doesn’t fit, it’s either ignored or substantially altered. Unless, of course, it’s exactly what you want to hear. So perhaps instead of trying to break an existing belief, we can attempt to implant a new belief altogether and highlight the positive aspects of the information that we’re offering.

This all sounds very abstract, I know. Let me give you an example: vaccines. Parents who refuse to vaccinate their kids because of the alleged link to autism often are not convinced by science suggesting that there is no link between the two. What to do then? A group of researchers, instead of trying to break that belief, offered the parents more information about the benefits of the vaccine. True information: how it actually prevents kids from contracting deadly diseases. And it worked.

So, when trying to change opinions, we need to consider the other person’s mind. What are their current beliefs? What are their motivations? When someone has a strong motive to believe something, even a hefty sack of evidence to the contrary will fall on deaf ears. So we need to present the evidence in a way that is more convincing to the other person, not necessarily in the way most convincing to us. Identify common motives, and then use those to implant new beliefs. Thank you.

