Micah Saul: This project is built on a hypoth­e­sis. There are moments in his­to­ry when the sta­tus quo fails. Political sys­tems prove insuf­fi­cient, reli­gious ideas unsat­is­fac­to­ry, social struc­tures intol­er­a­ble. These are moments of crisis. 

Aengus Anderson: During some of these moments, great minds have entered into con­ver­sa­tion and torn apart inher­it­ed ideas, dethron­ing truths, com­bin­ing old thoughts, and cre­at­ing new ideas. They’ve shaped the norms of future generations.

Saul: Every era has its issues, but do ours war­rant The Conversation? If they do, is it happening?

Anderson: We’ll be explor­ing these sorts of ques­tions through con­ver­sa­tions with a cross-section of American thinkers, peo­ple who are cri­tiquing some aspect of nor­mal­i­ty and offer­ing an alter­na­tive vision of the future. People who might be hav­ing The Conversation.

Saul: Like a real con­ver­sa­tion, this project is going to be sub­jec­tive. It will fre­quent­ly change direc­tions, con­nect unex­pect­ed ideas, and wan­der between the tan­gi­ble and the abstract. It will leave us with far more ques­tions than answers because after all, nobody has a monop­oly on dream­ing about the future.

Anderson: I’m Aengus Anderson.

Saul: And I’m Micah Saul. And you’re lis­ten­ing to The Conversation.


Aengus Anderson: Once again we’re back a lit­tle late. Micah is out for this one, but I’m here.

Neil Prendergast: And I’m here too. Hey, everybody.

Anderson: This is actually the last conversation I recorded in the first big phase of production. So this was taped in December of 2012, and then next we’ll be moving into the new round of interviews, all San Francisco-based, that I did more recently.

This one is with Ed Finn. He’s up at Arizona State University in Tempe. And he’s the head of the Center for Science and the Imagination, CSI as an acronym, which is a very excel­lent thing. But they’re not actu­al­ly a crime-solving unit. The Center was found­ed as a result of a con­ver­sa­tion about the future between a sci­ence fic­tion author, Neal Stephenson, and ASU’s pres­i­dent, Michael Crow. 

Prendergast: And as the sto­ry goes, Stephenson was bemoan­ing the dearth of grand visions and sci­en­tif­ic projects since the 1960s, and our cul­ture’s gen­er­al slide towards dystopi­an fantasies. 

Anderson: Which is kind of awe­some, right. Because he’s one of the fathers of cyberpunk.

Prendergast: So he’s hav­ing this con­ver­sa­tion with ASU pres­i­dent Michael Crow, and in response Crow sug­gest­ed that the prob­lem lay part­ly with sci­ence fic­tion authors and oth­er thinkers who shape how peo­ple dream about the future. And this helped cre­ate, it seems, Ed’s job, which is to bring thinkers togeth­er from mul­ti­ple fields and cre­ate real­is­tic but also opti­mistic visions of the future.

Anderson: So today we’re going to be talk­ing a lot about dream­ing. And this con­ver­sa­tion is going to be…pretty abstract. You know, we’ve done a lot of ones that were more tan­gi­ble recent­ly. You know, we’ve talked about things like coop­er­a­tives. But we’re real­ly going to be talk­ing a lot about nar­ra­tive today. And we’re going to be talk­ing about dif­fer­ent social con­di­tions that lead to dif­fer­ent nar­ra­tive struc­tures or block the time we have for dreaming.

But before we do that, just one little word to add in. We’re pitching a panel to South by Southwest, which will be in the spring of 2014. The panel’s called “A Sheep in Wolf’s Clothes: The Myth of Disruption.” And basically we’re going to be talking about the sort of tech futurism that we’ve seen so much of, where there is this kind of blend of free market capitalism and representative democracy, and a certain type of scientism. And there’s kind of an unquestioned faith in progress that embodies all of those things that’s one of the really big discourses on the future. And what we want to talk about is we’ve talked to all these different people in this project, and what are the other visions, you know. Because we can see the kind of Silicon Valley vision being advanced in really prominent places like the TED Conference. But you don’t always see it in conversation with folks like John Zerzan, or more recently say, Carlos Perez de Alejo. So we want to ask, publicly, at South by Southwest, how do you have a conversation between these things?

Prendergast: I think what Aengus is saying is “vote for us.”

Anderson: God, you do that so much better than I do. That’s probably the best thing. So if you go on the South by Southwest panel picker, look for “A Sheep in Wolf’s Clothes: The Myth of Disruption,” or you can just search for Micah Saul or Aengus Anderson on there, and give us a vote. And if we’re really lucky Neil will be free then and he’ll be joining us there too. So, if you’ve enjoyed this project and you might even be at South by Southwest, you can berate us in person. It’ll be a great time.

Prendergast: But for now, let’s make some space for dreaming.


Ed Finn: The Center, one of our core goals, our mis­sion state­ment, is to get peo­ple think­ing more cre­ative­ly and ambi­tious­ly about the future. What I mean when I talk about that is that we need to come up with bet­ter sto­ries about the future. If you want to build a bet­ter world you have to imag­ine that world first. So, I think that cre­ativ­i­ty is at the core of the con­ver­sa­tions we need to be hav­ing about the world we want to build, whether that’s in the con­text of sus­tain­abil­i­ty, or send­ing manned mis­sions to space, or edu­ca­tion reform, any­thing. You have to start with imag­i­na­tion. You have to start with dream­ing and the ways in which we talk about ideas and play with ideas.

So, we have collaborations that bring together science fiction writers with scientists. We have collaborations where we’re trying to get students in college to write really ambitious and creative things about the future. And when we do all of these things, we want to create that safe space, basically, where you can step outside of your disciplinary boundaries, out of your professional identity, and throw out a really crazy idea. What we sometimes call a “moonshot” idea, though I don’t always like the term “moonshot” because I think it overly specifies a certain kind of ambition, and I think of the creativity and the imagination that we are interested in much more broadly. And it includes artistic imagination, it includes literary imagination, it includes kids drawing stuff in their kindergarten class.

So that’s what comes to mind when you men­tion dream­ing. And what I think is real­ly impor­tant about the work that the Center does is to expand that def­i­n­i­tion of dream­ing and make sure that it includes every­body, to talk about sci­en­tists as well as sci­ence fic­tion writ­ers as peo­ple who dream about the future and to rec­og­nize that inspi­ra­tion, whether you are an engi­neer or a poet, prob­a­bly starts from pret­ty much the same space. And I don’t know where exact­ly in the brain that is, but it starts from a cer­tain kind of open­ness and a will­ing­ness to shift your per­spec­tive and see the world in a dif­fer­ent way, to have an idea. And then I think what’s inter­est­ing about dreams is not just that you have this sin­gle moment, but a dream is an extend­ed nar­ra­tive. It’s a space where any­thing is pos­si­ble but you also play out some kind of a sto­ry. And so, with my back­ground in the human­i­ties I think that sto­ry­telling is a cru­cial part of this whole idea of dreaming.

Now, what’s wonderful about stories is they help us to share a vision of the future in a way that is not overly specific or dominant—it doesn’t say “this is exactly how it’s going to be.” A good story is a sketch of the world. And a lot of the imaginative work is done by the people listening to that story or reading that story. Neal Stephenson, one of our collaborators with the Center, likes to say that a good science fiction story can save hundreds of PowerPoints and boring meetings, because it just puts everybody on the same page with a big idea. You create that sense of momentum, that sense of character and human engagement, how an idea actually fits into a social and a cultural landscape. And those are the key elements to getting people on board in starting to understand what kind of future world we could live in.

Aengus Anderson: One of the things that real­ly res­onates with me about when we talk about dream­ing is of course this project is The Conversation. And what is The Conversation about? Well, it’s a bunch of wild ideas about the future. It’s the ques­tion of how do we want to live? What is the good way to live? And dream­ing is of course…maybe that’s the nar­ra­tive side of the same thing. Do you think that we are dream­ing as a soci­ety, or as a plan­et…enough?

Finn: No, I don’t think we are. I think that we are so successful now at filling our days with fragments, and miniatures, and other people’s dreams—or manufactured dreams, that are really just designed to entertain us, that we don’t leave enough space for deep thinking and extended reveries (to extend the dreaming metaphor). And I think that’s really important. I think it’s important to think about the future at every stage in human development. I think it’s especially important now because we’re at the cusp of a lot of really profound changes to human society and the planet that we live on, ranging from the impact of human civilization on the climate and the systems of the Earth, to the increasing interconnection of human society in terms of the digital world and a growing thickness in the mesh of information that wraps our world. To the discoveries that we seem to be on the cusp of in terms of cognition, and memory, biological enhancement. So, in many ways we are beginning to radically redefine, or developing the power to radically redefine, what it means to be human.

I think that dreaming is very important and it’s something that we tend to push aside, because we’ve become so focused on minutiae. The explosion of information, the facility with which we can now access all of this information, leaves us with no downtime. I find it fascinating, for example, that in the Middle Ages people had this first sleep and second sleep. And scientists apparently have figured out that when you live on a diurnal cycle and you’re just following the natural progression of the seasons, the human body actually naturally adjusts, in the wintertime, to waking up in the middle of the night. You have a first sleep; you know, the sun goes down at five or six o’clock or whatever it is. And you sleep for a few hours. And then you wake up. And you hang out for awhile—an hour, maybe two hours. And then you go back to sleep for a second REM cycle. And this is what your body naturally does.

And so it’s shock­ing to think that there is this entire cul­tur­al phe­nom­e­non, this entire phase of the day, that does­n’t exist any­more. If it’s two hours out of twenty-four, that’s a twelfth of human exis­tence that some­how dis­ap­peared because we’ve start­ed to reg­u­late our­selves accord­ing to the fic­tion that we call time, the fic­tion of the twenty-four hour day.

That’s real­ly star­tling. And it was a very weird time. It’s not like being awake dur­ing the day. Your brain is in a very dif­fer­ent state, and you think of your­self in the world in a dif­fer­ent way. It’s obvi­ous­ly dark and there are only a few things that you can plau­si­bly do. And that whole phase of our exis­tence and that whole space is prob­a­bly a great time to day­dream, to reflect on what’s hap­pened or what will happen.

And so it’s real­ly star­tling to think about moments like that and rec­og­nize that we’re already pro­found­ly chang­ing what it is to be human. We now have these vast ten­drils of con­scious­ness that extend to all of our devices in dif­fer­ent places, our social net­works which are so much more com­plex and medi­at­ed than they used to be. We have a lot more pow­er but we also have a lot more…delegated respon­si­bil­i­ty where we’re count­ing on all of these algo­rithms and sys­tems to tell us how to feel and when to pay atten­tion to things.

So we need space for dreaming, more than ever. Because dreaming is a way that you practice for the world. Scientists have studied this as well and you can actually see people rehearsing particular physical movements and states of mind in their REM sleep. But we need to carve out more time for daydreaming and for conversation and reflection about the future. Because that’s another way to practice. That’s another way to talk through ideas.

I’m really concerned by the ways in which we’ve created this illusion of depth of field in terms of the kinds of information that we can get online or through watching television or however we get our news these days. But actually, we end up becoming quite narrowly focused, and things like Google profiles end up shaping an informational universe designed just for the individual. That’s really startling because at a certain point as we put more and more of our stuff online and rely more and more on these technical systems to access our memory banks, as it were, we’re increasingly going to live in a world, a Descartes sort of “cogito ergo sum” world that is individual for each of us. And that is another space that then becomes lost for dreaming. It’s another language that we no longer have to dream together and to talk with other people.

Anderson: Yeah, and that seems like a real­ly inter­est­ing part. Because when you were first talk­ing about dream­ing I was think­ing about okay, here’s the indi­vid­ual los­ing time and space for dream­ing in their life. But this is also the com­mu­ni­ty ver­sion, right, in that we do col­lec­tive­ly dream, and that is how we address big ques­tions of like, how do we live as a group? 

A ton of connections there to other things I’ve talked to people about in this project. And maybe the biggest of these connections was sort of the idea of the structure of our society making it difficult for us to dream. I was thinking of John Zerzan, the neoprimitivist, the anti-technology thinker. And a big part of his critique of technology is that technology has a bias, and that you can’t just say, “No, technology can be used in a bunch of different ways.” He says technologies have specific uses; they encourage the centralization of control, they encourage a type of regularization of things.

And so I won­der, what’s the refu­ta­tion to sort of Zerzan’s claim that maybe this type of hyper­spe­cial­iza­tion we’re get­ting just in our social struc­ture… Well one, that that’s an inevitable result of tech­nol­o­gy. And two, that that would prob­a­bly lead to reduced dreaming.

Finn: It’s a really good question.

Anderson: Because I think he would also agree that we’re dream­ing less.

Finn: Yeah.

Anderson: But I think he would get there in a very dif­fer­ent way?

Finn: So I have a bet­ter answer to the sec­ond part than the first part. To the first part I’d say I’m not sure if it’s inevitable. But I do agree that tech­nol­o­gy is a struc­tur­ing force and that our tools shape us much more than we shape our tools. You know, we design some­thing and then it con­tin­ues to influ­ence us, and in future gen­er­a­tions for as long as we use that tool. 

So I think that point is well tak­en. And I think maybe the sec­ond part of my answer sheds some light on the first. I think that what we need to do to com­bat that prob­lem is to think about nar­ra­tive and sto­ry­telling and ideas of lit­er­a­cy. So this spring I’ll be teach­ing a class here at Arizona State University called Media Literacies and Composition. Part of what I want to do in the class is to devel­op a sense of algo­rith­mic lit­er­a­cy, to get my stu­dents to think about this dig­i­tal world, this infor­ma­tion­al land­scape, as one filled with cul­tur­al machines. You know, the algo­rithms that deter­mine which Facebook posts you see or don’t see. The algo­rithm that rec­om­mends books for you. These are machines that may have been designed by engi­neers who thought they were just solv­ing a tech­ni­cal prob­lem. But as Zerzan men­tioned, they actu­al­ly are pow­er machines, you know. They change the world. They change our cul­tur­al land­scape. They make cer­tain things impos­si­ble or pos­si­ble, or they make cer­tain things much more dif­fi­cult than oth­er things. 

The exam­ple I like to give my stu­dents when I talk about this is when you’re play­ing the game Grand Theft Auto, you just push one but­ton and you can steal any kind of vehi­cle, right. You pro­ceed through all these com­pli­cat­ed actions that most humans would­n’t nor­mal­ly be able to do. And so obvi­ous­ly that becomes the thing you do in the world because it’s the first word you learned in the lan­guage of Grand Theft Auto, and it’s some­thing that you can do wher­ev­er you are.

So I think that the response to this argu­ment about tech­no­log­i­cal deter­min­ism and grow­ing com­plex­i­ty is to focus on what’s always dis­tin­guished us as a species is this pow­er of nar­ra­tive and sto­ry­telling. To cre­ate forms of col­lec­tive belief and to rein­vent and reimag­ine the world. And I think that’s how peo­ple always respond to tech­nol­o­gy. If you’ve ever read the French author Michel de Certeau, he talks about ideas of poach­ing and resis­tance. He’s think­ing about the tech­nolo­gies in the struc­tures of the work­place, and the ways in which peo­ple might steal work time to work on a per­son­al project, or maybe you take a ream of paper home. 

These days, peo­ple engage in all sorts of com­pli­cat­ed games with the tools that we’re sup­posed to be using in seri­ous ways. You might have a real­ly saucy con­ver­sa­tion with the auto­mat­ed reser­va­tion agent on the air­line phone line, you know. Or you might write a com­plete­ly far­ci­cal memo at work and use satire as a force for dis­rupt­ing and sort of break­ing up some of these forces.

There are lots of things peo­ple do and I think peo­ple will always do them. Because humans are always humans. We aren’t ratio­nal actors. We aren’t machines. And as much as we like to pre­tend that we are at times, it’s very much a game of dress-up. And we always end up rein­scrib­ing or over­rid­ing these rules and cre­at­ing new spaces for us to be play­ful and cre­ative, and angry and depressed, and to do all of the things that these very pre­scribed dig­i­tal spaces were nev­er intend­ed to do in the first place. So I think that sto­ry­telling is real­ly the answer to that challenge.

Anderson: That’s inter­est­ing because Zerzan’s answer of course to that prob­lem is well, you can’t real­ly get around it, throw tech­nol­o­gy away. But what you’re talk­ing about makes me think much more of Douglas Rushkoff when he was writ­ing Program or Be Programmed and in our con­ver­sa­tion talk­ing about becom­ing lit­er­ate to these sys­tems and real­iz­ing that things that seem…normal? have bias­es and things pro­grammed into them. It’s inter­est­ing that with Zerzan and Rushkoff, both can say tech­nol­o­gy has a bias but one says you can reengi­neer the bias, and the oth­er one says you can’t, just get rid of technology. 

Finn: And you know, I think it’s impos­si­ble to get rid of tech­nol­o­gy. There are many species that use tech­nolo­gies to do stuff. And evo­lu­tion itself arguably is a tech­nol­o­gy or it enables us to devel­op tech­nolo­gies like oppos­able thumbs and binoc­u­lar vision. And I think that it’s real­ly implau­si­ble, even at that fun­da­men­tal lev­el, to imag­ine some sort of total aban­don­ment of technology.

So I’m much more in Rushkoff’s camp. And I think that one of the key and abid­ing val­ues of the human­i­ties and con­cep­tions of nar­ra­tive is that you’re real­ly teach­ing peo­ple how to think and how to see the world. And that is fun­da­men­tal­ly what’s impor­tant. To learn how to take a step back from the received wis­dom of what­ev­er this object is. To avoid just buy­ing into the rhetoric that you’re being offered in what­ev­er sit­u­a­tion you’re in. And to learn to ask your­self are there oth­er ways that this could be? Are there oth­er ways that this should be? This is some of the stuff I’m hop­ing to teach my stu­dents this spring. And I think that is the only way to main­tain our agency in the world.

Anderson: So, if tech­nol­o­gy is inevitable. And we’ve also talked about how it does have bias­es. But we’ve also talked about how those bias­es are fixed. And so, to cir­cle back to dream­ing from there, if we’re in a soci­ety that for many…it seems like tech­ni­cal rea­sons does not have time to dream—technical, eco­nom­ic, social rea­sons, does­n’t have time to dream…why not? What are the struc­tur­al things that we can do to encour­age dreaming?

Finn: So, I think it fun­da­men­tal­ly comes down to fram­ing the ques­tions that we explore in dif­fer­ent ways. Because we’re dream­ing all the time. I mean, all of those…the enter­tain­ment prod­ucts that I men­tioned before, shape the way we think about the world, the way we think about the future. The Cosby Show changed the way that we think about race in America. Shows like Star Trek changed the way we think about the future and space travel. 

I think the way we get peo­ple to dream more pro­duc­tive­ly is we change the fram­ing, and we get peo­ple to take the future a lit­tle more seri­ous­ly. One of the things that the Center for Science and the Imagination is try­ing to do is to get peo­ple to think about the future as a spec­trum of pos­si­bil­i­ty. Get peo­ple to think about the future as not some­thing that some peo­ple in white coats are build­ing in a lab some­where. As not the next prod­uct release from Apple. But as some­thing that we’re all invest­ed in, whether we’re mak­ing active con­scious choic­es about it or not.

So I real­ly think that we need to eman­ci­pate the future. We need to break it out of that lab, we need to break it out of the cor­po­rate prod­uct line-up and say no, it involves a lot more stuff. It involves every­thing. And there are a lot of dif­fer­ent path­ways that we could fol­low. When you start with that frame­work, then the con­ver­sa­tions you have can still involve all of the same things—they can still involve Star Trek or they could involve the car you dri­ve to work. But the fram­ing is dif­fer­ent and so then the sto­ries that you tell about that stuff, they’re going to be dif­fer­ent stories.

Anderson: Why do you think we have this dis­em­pow­ered sense when we think about the future? Is that relat­ed to com­plex­i­ty else­where in our soci­ety, where it feels like soci­ety is so com­plex that when we look at it as indi­vid­u­als it feels like we are much too small and we’ve start­ed to real­ize the daunt­ing com­plex­i­ty of the thing that we have cre­at­ed and are embed­ded in?

Finn: I think there are a lot of fac­tors in it. I mean, one fac­tor is that we haven’t had a real­ly con­vinc­ing glob­al nar­ra­tive for the future since 1989. Nothing helps sharp­en your sense of belong­ing in progress like a good ene­my, right. And so that was a phase after World War II where there was…rel­a­tive sta­bil­i­ty on our plan­et. And peo­ple were work­ing hard and there was you know, gen­er­al­ly a sense of progress. And there was a sense of a col­lec­tive path­way and a col­lec­tive vision. And I think ever since then we’ve been strug­gling to come up with a new ver­sion of that. 

But I don’t think that it’s really true that we’ve been completely wayward and lost. Because we keep seeing these incredible manifestations of the positive power of the network and these digital technologies. The Arab Spring, which upended exactly the kind of regimes that the Cold War era would have described as essentially endless.

And the power of a good story is that you can just transform behavior overnight. I had a conversation a few weeks ago with Peter Byck, who’s a director who made the documentary Carbon Nation. And he is trying to figure out how he can get people to think of sustainability and climate change as a World War II-style moment. Because after Pearl Harbor the United States, within a matter of months, completely transformed the manufacturing and production cycle of the US. We went from a nearly total peacetime manufacturing structure to this incredible war machine that just kept getting stronger and stronger over the next few years. But even within that first year they were producing thousands of planes, they were producing all these ships. And if you were measuring it in percentage points, you know this huge percentage of the population—you would see these huge shifts in social behavior, in a very short amount of time.

And one of the prob­lems we have when we think about tech­nol­o­gy, and this is some­thing that the dean of the School of Sustainability here at ASU has said, is when we think about just try­ing to solve these prob­lems with tech­nolo­gies, you’re strug­gling to find a 5%, a 10% improve­ment in some par­tic­u­lar cor­ner of say, the cli­mate change and sus­tain­abil­i­ty con­ver­sa­tion. But tech­nol­o­gy is what got us into this prob­lem in the first place, you know. And if you keep try­ing to come up with inno­va­tion to get your­self out of the hole that you inno­vat­ed your­self into, you’re not going to be able to do it. You have to come up with those trans­for­ma­tive sto­ries. You have to get peo­ple on the right page to under­stand a broad­er social con­text. And that becomes a ques­tion of agency again. Because if you just tell peo­ple exact­ly what they have to do to resolve this prob­lem, you’re nev­er going to suc­ceed because peo­ple aren’t going to do it, they’re not going to under­stand why, they’re not going to care. But if you can con­vince peo­ple to care, if you can come up with that nar­ra­tive and let them fig­ure out what it is they should do, you give them the agency and you del­e­gate this respon­si­bil­i­ty to the future to them, then you start to see real­ly pro­found changes.

Anderson: This con­nects out two real­ly inter­est­ing con­ver­sa­tions I record­ed on oppo­site ends of the coun­try. One was with David Korten of the New Economy Working Group. We talked a bit about nar­ra­tive, which real­ly sur­prised me. And he felt you need a new nar­ra­tive, the nar­ra­tive of being embedded. 

And kind of on the other end of the country how that manifested itself politically was in this other conversation I had with a guy named Puck Mykleby, and he’s a Marine colonel. And he along with another guy in the Pentagon redesigned the United States’ strategic narrative, to go from containment to sustainment, broadening the definition of what is the current threat. Well, the current threat is sustainability. And how do you get to that? Well, it’s not just an environmental question. The strategic imperative for the country is partially environmental, but it’s also all these other things that lead you to thinking environmentally.

Do you think that we can have a new nar­ra­tive around some­thing as neb­u­lous as sus­tain­abil­i­ty? You know, I mean Russia, you can paint it red on a map, but sus­tain­able cli­mate? That’s fuzzy. We can’t even agree on whether or not that exists.

Finn: I really like your mentioning of this idea of embedded consciousness, and frames of thought, and cultural frames that recognize that we’re embedded in different systems. It reminds me of Buckminster Fuller’s great book Spaceship Earth and the idea that the Earth is a complicated organism, a mechanism, that has all these different moving parts and we’re just one element in a much broader system. Which both reduces our sense of hubris and makes us recognize that we’re part of something much bigger. But also still increases our sense of responsibility by pointing out that when we screw up certain things it has all of these far-reaching implications with the rest of the system because we’re still part of it, this much bigger thing.

So what I think is impor­tant is to rec­og­nize that we’re also embed­ded in time, and to expand the hori­zons that we think of. One of the real­ly pro­found chal­lenges we have right now is that I think we don’t think very far ahead into the future. Politics in America is not very well-suited to think­ing even one or two years ahead, much less ten years ahead, or twen­ty years ahead. That’s a major prob­lem, and I think that’s actu­al­ly how we should tack­le this ques­tion of sus­tain­abil­i­ty and bet­ter nar­ra­tives. Because once you extend your hori­zon just a lit­tle bit far­ther out, you start to see how sus­tain­abil­i­ty is real­ly in your self inter­est. It’s not about believ­ing in some new reli­gion. It’s about oh, if I do this now I’m going to save all this mon­ey and I’m going to save all this time. When you extend your vision in the tem­po­ral hori­zon you begin to see all of those hid­den costs become more vis­i­ble, and all of the exter­nal­ized prob­lems that we don’t see reflect­ed in the price tags in our super­mar­ket sud­den­ly become a lot more obvi­ous. And I think that’s real­ly important.

Now, I don’t think that’s necessarily easy. And I think that again it’s the kind of thing where you have to come up with a whole set of narratives and stories that connect to one another, and gradually shift consciousness. And I think it ranges from people who believe very much in that God and man-centered universe, evangelicals who say we’re stewards of the Earth and it is our responsibility to protect this world that God has given us, to those who say we adopt sustainability as a strategic initiative, we adopt sustainability as a core national defense issue. To people who just say, “You know, I think this is a moral imperative because many people will suffer if the oceans rise or crops start dying out.”

So I think there are all these dif­fer­ent nar­ra­tives, but I think that the core of what needs to change is we need to expand that hori­zon of our think­ing. The 10,000 Year Clock from the Long Now Foundation [crosstalk] is a nice example.

Anderson: Yeah, I spoke to Alexander Rose and I was think­ing about him as you were say­ing that, you know, because he was talk­ing about once you expand your time hori­zon, then a lot of things that seem impos­si­ble to solve become pos­si­ble. And I think what I have to ask when I hear a claim like that is of course, are we bio­log­i­cal­ly inclined to favor the short-term? And, if we are is nar­ra­tive the way to sort of trick our­selves into think­ing long-term because you’ve got this deep­er under­stand­ing that is now part of a sto­ry, and that’s why you don’t like, eat all the choco­late right now.

Finn: We may nev­er have been as short­sight­ed as we are now.

Anderson: Really?

Finn: Think about the plan­ning and vision it took to build the pyra­mids. The cathe­drals that took cen­turies to build. People used to have a very dif­fer­ent under­stand­ing of them­selves in the broad­er uni­verse. That Christian you know, Western reli­gious mythos of God and man work­ing togeth­er put man at the cen­ter of the uni­verse but also had a very dif­fer­ent tem­po­ral frame for how we lived in the world. This present time on Earth was real­ly just a small part of what would be an eter­nal exis­tence, and that you know, if you lived a good Christian life you would go on to Heaven and the time you spend on Earth, the suf­fer­ing that you endure here is going to be insignif­i­cant in that broad­er frame. 

And when you start think­ing of the world in those terms it sud­den­ly becomes rea­son­able to say well you know, I’m going to spend my thir­ty years of pro­fes­sion­al life or you know, prob­a­bly fif­teen or twen­ty years at that peri­od in his­to­ry of pro­fes­sion­al life, work­ing on this build­ing. It’s not going to be done. It’s not even going to be half-done by the time I die. And I know I’m going to die not so long from now. But this is good work, and it’s part of some­thing big­ger, and I believe in that big­ger thing.

If you can just estab­lish that frame, then all of these oth­er ele­ments start to fall into place. And I don’t think that it requires some kind of rad­i­cal bio­log­i­cal engi­neer­ing of the human mind or any­thing like that. Sure, we have these very pow­er­ful, innate sys­tems in our brains that tell us how to act. But I’ve got to say I think it’s noth­ing on the pow­er of the social, and the pow­er of the sys­tems that we build com­mu­nal­ly. You know, every­thing that we’ve done as a species is because we’ve learned to work togeth­er, to col­lab­o­rate in these incred­i­bly pow­er­ful, explo­sive, excit­ing, dan­ger­ous, thrilling ways. And the things we can achieve when we just come up with a sto­ry that we all buy into are limitless.

Anderson: It seems like you feel that we are too eager to sort of con­sign these things to biol­o­gy, in a way, that almost makes them inevitable and then take them off the table for discussion?

Finn: Yeah. I value the study of evolutionary psychology and I value the study of cognition and the ways in which we operate. I think that’s extremely important and fascinating work. But I think it’s a little too easy to push things off the table because we say well, that’s just how humans are built and there’s no way to change that. I think understanding how our innate systems work, how our motivations work, is very important. But you can use that knowledge to change almost any kind of human behavior.

Anderson: An avenue actually I think we need to explore is of course what happens if we keep dreaming in ways that are very incremental and small. Or we are willing to outsource our dreaming to a research lab, or to people who seem smarter than us and we see on…you know, getting a TED talk and we’re like, “Well, those people have really got it. They’ve got lots of advanced degrees. I’m just gonna go to work, take care of my family. They’re dreaming.”

Finn: So I think one of the problems we have as humans is that we are pattern-seeking animals. And we like genre. We’re kind of lazy. And so we like stories that make sense because we know how they’re going to turn out and they fit into what’s a normal, standard, officially approved, acceptable ending. And so we tend to outsource our really ambitious thinking and our risk-taking and our dreaming to a few different groups of people who then become the sort of professional risk-takers and dreamers.

But they get sucked into the genre, too. Genre has rules, and genre has bound­aries. And TED talks are a kind of genre. The Silicon Valley entre­pre­neur is a genre. The med­ical researcher bat­tling can­cer has a genre. And all of these peo­ple are real­ly smart and they do real­ly impor­tant and inter­est­ing work. And some of them might even see the genre walls around them and want to break out but you know, they’ve got to apply for the next grant. They’ve got to pitch this idea to a very par­tic­u­lar audi­ence. And so they play to the crowd, and they play to the genre rules that are set up before them. And we end up repeat­ing and reca­pit­u­lat­ing ideas that we’ve had before. And we end up los­ing a lot of real­ly good infor­ma­tion and good think­ing because it does­n’t fit into the very unique genre of sci­ence fic­tion that is the grant proposal.

Anderson: Early in our conversation we mentioned like, environmental questions or the integration of social and economic systems which need to work for people to survive. Or the potentiality of altering the genome, or cybernetically enhancing ourselves. Those are some of the issues. And if not dreaming is a problem, where does it go if we are trapped in genres?

Finn: Well I think people are dreaming. I think that there’s been a shift in the zeitgeist, in the past year or two especially. And this project you’re working on is one example. The White House Office of Science and Technology Policy has this new Grand Challenges initiative. There are a lot of people who are recognizing that we need to get back to thinking big and doing big stuff, being really ambitious in our work.

Of course, there have always been peo­ple doing that. And I think some of our most suc­cess­ful, world-changing fig­ures are peo­ple who had some real­ly big ambi­tious idea and they just went for it. We are cre­at­ing these broad­er con­ver­sa­tions but I think it’s impor­tant for what we’re doing here at the Center and at ASU more gen­er­al­ly to keep doing it, and to lead by exam­ple. And I think one of the great chal­lenges when you’re try­ing some­thing new is tak­ing that first step or con­duct­ing that first exper­i­ment. And so one of the things I’m most excit­ed about as we launch this new thing is that we’re just doing stuff. We’re try­ing it out and we’re going to see how it works. A real­ly impor­tant part of dream­ing and of good cre­ative work in gen­er­al is to embrace fail­ure as a valid and valu­able out­come, some­thing that you can learn from, some­thing that if it does­n’t hap­pen often enough you should be a lit­tle sus­pi­cious. And some­thing that is real­ly in a lot of ways no less use­ful than suc­cess in help­ing to learn and shape your think­ing about what you’re doing.

Anderson: We’ve been talk­ing a lot about dream­ing. The implic­it assump­tion being that we can do bet­ter. What does a bet­ter future look like? And of course, obvi­ous­ly you’re inter­est­ed in cre­at­ing a space for a con­ver­sa­tion; you’re not going to be so dog­mat­ic that you have one. But in gen­er­al sort of para­me­ters, what are some good things to be work­ing towards?

Finn: I think I’d like to talk about thoughtful optimism, which is what I am trying to work toward with the Center. I am not endorsing the Panglossian view of the world as “this is the best of all possible worlds; the future’s going to be even better; it’s all gonna be great; don’t worry about it.” That’s really not what I’m trying to do. If we think about the future as a space of possibility, then we can avoid the fallacy of determinism. We can avoid thinking that the people elsewhere, the people in the white coats, or governments, or some other third party entity, or the machinery of technology and civilization itself is somehow going to just grind out the future that we’re going to live in and we have no power to change it.

Instead, we need to try and figure out what we want to happen, and we need to come up with a really great story about why that should happen. And if enough people believe that story then we can build towards that world. We can figure out how to get there.

I really think that’s at the core of the world that I want to live in, is having that conversation and connecting the humanities, and the sciences, and the arts in ways that don’t end up privileging one or the other or perpetuating the weird narrative we have now that like oh, scientists can’t really be creative, or if you’re an artist you can’t do math. But just to create a whole new axis along this idea of the imagination. And welcome everybody into it.

There was this passage that one of my teachers in high school made us all memorize. And I can only remember a small fragment of it now but it was about what makes for a good education. The part that I remember, ironically—to paraphrase the part that I don’t remember—is you know, you’re going to learn a lot of stuff in school, and you may forget some of it. But the shadow of lost knowledge at least protects you from many illusions. And I think that’s a really important function of a good education, is to expose people to all of these different intellectual structures, all of these different systems of thinking and ways of thinking. And you don’t have to embrace all of them and you don’t even have to remember everything that you learn. When you have enough diversity of thought like that, you get people making more thoughtful decisions. You get people avoiding more obvious pitfalls. You get better stories.

Anderson: I spoke to a philoso­pher, Lawrence Torcello in this project, and we talked a lot about con­ver­sa­tion and how do you bring peo­ple with dif­fer­ent ara­tional assump­tions togeth­er to talk about a col­lec­tive future. And he was talk­ing about plu­ral­ism and clas­si­cal lib­er­al­ism. You know, need­ing to work towards a com­mon good but there are times, there are peo­ple, who are just nev­er ever going to have that con­ver­sa­tion with you.

And he ended up on sort of a…pessimistic note, in a lot of ways. So I think what’s interesting in our conversation today—and maybe this is a note to close on, is the idea that narrative can maybe get you around some of that. That you can have this sort of classical Enlightenment-era Jeffersonian conversation about government. And it won’t work, because you’re talking to someone who has a fundamentally different idea of the world than you. Do you think you can bridge some of that through sharing a story? Which is perhaps almost ambiguous enough that everyone can read a little bit different into it but maybe also gives you a common value which isn’t stated in an explicit legal way?

Finn: I have two answers for you. The first answer is that there are cer­tain cul­tur­al objects and ideas that can serve as real­ly help­ful com­mon ground, where every­body thinks they know how to read this thing. Because they all feel like they kind of own it; they know what it is and they know how to talk about it. Then when you have the actu­al con­ver­sa­tion, every­body’s quite sur­prised, because they have very dif­fer­ent visions of this sin­gle thing. 

Finding the com­mon ground, find­ing that space, that open space where peo­ple can actu­al­ly have the con­ver­sa­tion, is real­ly the most impor­tant step. Because as soon as they start talk­ing to one anoth­er, they are exchang­ing ideas, right, and they’re engag­ing in the fun­da­men­tal cul­tur­al process of com­mu­ni­ca­tion. In a lot of ways what we strug­gle with now, espe­cial­ly in the US, is a break­down in that fun­da­men­tal kind of communication.

But my sec­ond answer is a lit­tle trick­i­er. We are pattern-seeking ani­mals but we also under­stand the world through nar­ra­tive. Our brain is fun­da­men­tal­ly a nar­ra­tive engine. It takes all the sen­so­ry input and it makes these sto­ries out of it. And that’s why we ignore lots of things; that’s why mag­ic tricks work, because you can kind of glitch the nar­ra­tive engine, right, you can trick it in dif­fer­ent ways. That’s how pres­i­den­tial elec­tions work, too.

And so, nar­ra­tive itself, in some sense, real­ly is real­i­ty. Now, I’m sure there are physi­cists who would strong­ly dis­agree and say there’s this pos­i­tivis­tic exter­nal uni­verse. And that’s also a nar­ra­tive so I’m not sure that I dis­agree. I think we prob­a­bly dis­agree about how far down the tur­tles go. 

But, I think that from a cultural perspective, narrative is really the world itself. And so when you’re talking about getting people to share narratives and come to communal understandings, it’s very tricky to say where that ends and where an empirical universe begins. From our everyday experience as human beings, and especially our experience as cultural actors, we create these narratives of the universe and that’s where we live. And that’s where we make all of our decisions. So in a lot of ways I think narrative is really the whole ballgame. And so if you can get people talking about things, if you can come up with narratives that people can share—and maybe not share entirely but at least step into somebody else’s consciousness. Because until cognitive science makes several great leaps, narrative is really the only technology we have for actually inhabiting somebody else’s mind in terms of true empathy. And that’s the fundamental engine of community and the fundamental bridge between the monad, the individual lost in space, and a sense of the collective.


Neil Prendergast: Well, Aengus. That was one of the most abstract con­ver­sa­tions I think we’ve seen so far.

Aengus Anderson: And I really like it. Maybe it’s because I’m a humanities dork and I’m always excited when someone says “narrative is reality.”

Prendergast: Yeah, I was sort of, I admit, sucked into his discussion of narrative because I see a lot of value in storytelling myself, too.

Anderson: You know, we’ve talked about narrative before. In this piece I mentioned David Korten’s conversation. But it’s been a while since it’s come up, and I’m glad we’re actually devoting a big chunk of time to it—like a full episode, basically, to that and dreaming. So let’s maybe break down how these different pieces work together and what they mean.

One of the big under­ly­ing issues here is that… God, I think Rushkoff said this. We’ve lost the narrative.

Prendergast: Right. If I recall cor­rect­ly, Rushkoff was say­ing in our con­ver­sa­tion that the sort of bar­rage of media that’s in front of us today has made it dif­fi­cult to see any kind of coher­ence among all these dif­fer­ent voic­es. And also to sort of see any kind of change over time, which is of course also an incred­i­bly impor­tant com­po­nent of nar­ra­tive. Instead we’re sort of stuck not real­ly know­ing where to look. Not real­ly know­ing what the sto­ry is. So I think it’s a great moment for some­one like Ed Finn to step in and make an argu­ment for narrative.

Anderson: Right. And it’s interesting, you know, just gonna bounce another thing off of Rushkoff’s idea there. We also talked to Ethan Zuckerman a long time ago. And he was talking about the role of serendipity, in that one of the challenges we were facing with…well, the Internet age, is that you get search, which is you looking directly for something. And you get social, which is your friends pointing you to their articles. But if you really want to pull in things outside of your group, you need something else. Zuckerman was advocating for basically, like, a serendipity engine…

Prendergast: Right. So Zuckerman seemed to be say­ing that peo­ple get stuck in these eddies on the Internet, where they can’t look out to see the big pic­ture. And it seemed like Rushkoff was say­ing well, even if you’re not stuck in an eddy, it does­n’t mat­ter because the way you expe­ri­ence the Internet is to not see the big pic­ture because there’s so many things com­ing at you so quick­ly and they’re all in such small pieces.

So it seems like even though the critiques are different, which I really enjoy, those differences, it seems like they’re coming to the same point that Ed Finn can pick up on, which is, “Hey, where is that big story?” And you know, I think as you point out, where is that big Cold War story, or something of that scale?

Anderson: Can we even have that Cold War sto­ry again? And this isn’t some­thing we get into as much in the piece but I think it’s some­thing that we should talk about here. Because obvi­ous­ly the Cold War sto­ry exists in a very dif­fer­ent tech­no­log­i­cal and media land­scape than the one that we’re liv­ing in now.

This is where I think Rushkoff and Zuckerman, and even going back to Andrew Keen much earlier in the project—you know, a lot of people who talk about the role of communications technology in our larger political landscape. And I think this is where they become really relevant. Can we have the kind of dream, or the kind of narrative, that Ed wants us to have now? Is it even possible? I think Zuckerman holds out that maybe it is possible if you get something like the serendipity engine. You know, he really disputed the idea that we couldn’t keep up with the information because of time. He felt that we couldn’t keep up with the information because we weren’t interested. Because we almost weren’t disciplined enough to be eclectic consumers of information. I don’t think Rushkoff felt that way. I think Rushkoff felt more that we’d hit our biological limits and there was so much information that was the real problem.

Prendergast: It seems like those are the critiques regarding the technology. And you know, I’d also add in that Finn has sort of touched upon a critique that’s…a little sort of cultural? Not to sort of totally separate that out from technology? But the Cold War was a different moment. The Cold War was a moment when the United States seemed to feel all the resources of the world are at our fingertips. Certainly there was fear of the Soviet Union. But there was also the success from World War II that drove the national feeling. And I think the current feeling is different. The War on Terror is such a defensive feeling, as compared to the Cold War. And that might seem odd to say. The word “defense” was all over the Cold War era. But I don’t think people always felt defensive. But I think today it’s a much more common feeling.

Anderson: Right. And I mean, just the fact that the Cold War is some­thing that you can…you can iden­ti­fy your adver­sary, right.

Prendergast: Right.

Anderson: And the War on Terror, as a nar­ra­tive, does­n’t have that qual­i­ty. It’s dif­fi­cult to fig­ure out when does it end? What are its para­me­ters? How is it won? Is it won? If it’s not won is it real­ly a nar­ra­tive in the same way? Does it moti­vate in the same way? Or is it more like some­thing you just try to push into the back­ground, you know? It does­n’t inspire the same sort of Herculean efforts towards great­ness, in a way. It inspires more air­port checkpoints.

Prendergast: Right. And of course you know, Americans don’t share the assumptions that it’s “the good war” in the same way that so many Americans shared that assumption for the Cold War.

Anderson: Yet, Ed seems to think that it’s pos­si­ble to cre­ate a new nar­ra­tive, right. If he did­n’t think that he would­n’t be doing his job. He prob­a­bly would­n’t have got­ten hired.

Prendergast: Right.

Anderson: He talks about thought­ful opti­mism, and a big part of that seems to be a con­fi­dence or maybe a faith or a hope that despite the cur­rent media land­scape, we can share a pret­ty broad nar­ra­tive. And I’m not real­ly sure I’m con­vinced of that. I mean, I want to be con­vinced of that. But I feel like things that Ethan Zuckerman point­ed out are real­ly sig­nif­i­cant prob­lems. That with such a frag­ment­ed media it’s hard to get that kind of com­mon ground for a narrative. 

Prendergast: And so I think there may be a ques­tion here about whether or not the type of nar­ra­tive that Ed Finn is describ­ing is pos­si­ble on the Web. Or maybe it’s only pos­si­ble else­where and it would just be mar­ket­ed through the Internet.

Anderson: So I guess the question is just one of how do you view the digital technology? Do you think that it can get us to a broad narrative? Or do you think that we can get to a point where culturally we’re so desperate for a narrative that we all go to one common place, despite having a fragmented media?

Prendergast: Yeah, and I think that’s sim­ply an open ques­tion, as we’ve seen peo­ple in The Conversation have dis­agreed about that. I think that that just kind of remains to be seen.

Anderson: Mm hm.

Prendergast: But I think that one thing that Ed Finn is saying that makes a lot of sense to me is that storytelling will always be with us because it is simply a part of who we are as humans. He I think mentions at one stage very briefly the way the brain functions. And I find that to be sort of a fascinating component of storytelling, in that storytelling connects that sort of logical part of your brain—the left brain—with the more sort of intuitive, emotional part of your brain—the right brain. And as I think everybody knows, good stories are logical. The cause and effect makes sense. But they also take you somewhere emotionally. And I think that’s why they’re so satisfying. It’s why storytelling is so wonderful for us as humans, because it lets us travel back and forth between these two different parts of the brain.

So I would say that bio­log­i­cal­ly we’re hard­wired to want this. It’s obvi­ous­ly an open ques­tion how we’re able to get it, and if we can get it on such a scale that every­one in the coun­try, or a hemi­sphere, or even larg­er scale can par­tic­i­pate in the same nar­ra­tive. But I think there’s no ques­tion that we’re a nar­ra­tive species.

Anderson: That’s real­ly intrigu­ing. And, what I like is that you actu­al­ly, whether wit­ting­ly or not, you real­ly set the stage for an upcom­ing interview—my con­ver­sa­tion with George Lakoff, when we go way into talk­ing about the brain, and sto­ries, and well…how that affects pol­i­tics. So we’ll just leave that as a nice lit­tle bit of fore­shad­ow­ing for a real­ly great up and com­ing conversation. 

At the same time, having said that, I think there’s a really important question which actually ties into something you said earlier about “defensive.” I think you mentioned the word defensive. And, we may find it innately satisfying to have narratives as humans—that’s fine and dandy, right. But we can have viciously competing narratives. We could have two giant narratives with two radically different political camps—does this sound like I’m describing reality at all? [Prendergast laughs] And maybe we’re too defensive to even listen to the other side’s narrative. And I mean, Ed seems to have faith that the narrative can be an ambiguous enough thing where people can read different elements into it. But is that really the case? Or are we too defensive to even hear each other’s narratives?

Prendergast: Right. So it seems like we can have on the one hand narratives that are perhaps a little explicit about what they mean, maybe, politically. On the other hand we can have narratives that are so ambiguous that you can read anything you want into them. And then where is the conversation?

Anderson: In which case is it even a use­ful nar­ra­tive, right?

Prendergast: Right. So maybe the question is you know, how do you find something in the magic middle? And maybe that’s quite a difficult task. And I think it kind of runs against, actually, the project here. One of our positions I think is that we want The Conversation to be possible.

Anderson: Sort of des­per­ate­ly. If nar­ra­tive works in the way that Ed wants it to work, then it should bring peo­ple togeth­er to have a con­ver­sa­tion. Its very ambi­gu­i­ty should do that. I don’t know if I buy that, but it’s a great asser­tion. But if that’s true, then does a project like this mat­ter at all? This project, The Conversation, is about sort of explic­it­ly talk­ing about what is good. It’s based on the notion that you’re going to sit down—and we’re not craft­ing a nar­ra­tive here. We’re talk­ing about val­ues right out in the open. We’re talk­ing about phi­los­o­phy in the open. We’re talk­ing about peo­ple’s spir­i­tu­al beliefs out in the open. And that takes a real lev­el of can­dor that we’re lucky to have from most of our inter­vie­wees. But it’s very dif­fer­ent than hav­ing a nar­ra­tive bring peo­ple togeth­er, isn’t it?

Prendergast: I think you’re right. I think that we are being a lot more explic­it about ideas than can hap­pen, or that maybe com­mon­ly hap­pens in say some sci­ence fic­tion. Although some sci­ence fic­tion is pret­ty explic­it about the val­ues embed­ded in it. But cer­tain­ly we’re run­ning right towards the explic­it here in The Conversation. And maybe that’s cre­at­ing too hard a line to open it up to oth­ers. I’m not sure.

But to kind of mark us on a spec­trum, I do think that we’re a long way away from the sort of con­fronta­tion­al style of com­mu­ni­ca­tion that you would find say on cable news.

Anderson: Right. And I mean that’s— You know, part of the hope of this project is that you don’t need the nar­ra­tive, in a way, to bring peo­ple togeth­er. Although it’s great, and I like the idea. But that was real­ly nev­er even in our think­ing when we were work­ing on this project. Our think­ing was go beneath the cur­rent con­ver­sa­tion that’s about real­ly obvi­ous issues and try to get into stuff that under­lies them. And I feel like we’re— Like Ed Finn’s nar­ra­tive is basi­cal­ly try­ing to cre­ate a depoliti­cized space where peo­ple of dif­fer­ent back­grounds can talk safe­ly about the future. And I feel like we’re try­ing to do that same thing with…abstract phi­los­o­phy a lot of the time in this project.

Prendergast: Right, right.

Anderson: And I’m not sure if we’re going to do it any bet­ter. The nar­ra­tive is prob­a­bly a bet­ter hope, and I’m not—in my heart I’m not con­vinced that that will work either. But I like the idea that there are these dif­fer­ent roads of doing it.

Prendergast: I feel very much like we are in league with Ed Finn.

Anderson: Oh God, yes. And maybe that’s one of the rea­sons this was such a fun conversation.

That was Ed Finn record­ed December 19th, 2012 in Tempe, Arizona on the cam­pus of Arizona State University.

Micah Saul: And you are of course listening to The Conversation. Find us on the web at findtheconversation.com.

Prendergast: You can fol­low us on Twitter at @aengusanderson.

Saul: I’m Micah Saul.

Prendergast: I’m Neil Prendergast.

Anderson: And I’m Aengus Anderson. Thanks for listening.

Further Reference

This inter­view at the Conversation web site, with project notes, com­ments, and tax­o­nom­ic orga­ni­za­tion spe­cif­ic to The Conversation.