Aengus Anderson: Well, Micah Saul is not joining us today, but he will be back next time. So in the interim you're stuck with just me, Aengus Anderson…

Neil Prendergast: And me, Neil Prendergast.

Anderson: And this episode is my conversation with Rebecca Costa. She's a self-proclaimed sociobiologist, a former advertising executive, a radio talk show host—The Costa Report may be syndicated in your community—and most notably the author of The Watchman's Rattle: A Radical New Theory of Collapse.

Prendergast: We first learned about her work early on in our episodes. She shares with us this notion that our own human biology has not kept up with the complexity of human society. And it seemed like an idea worth investigating a little bit more.


Rebecca Costa: If you were to ask me what the crisis in the present is, as an evolutionary biologist I have to go back millions of years and try to connect all the dots, going back to man as a single-celled organism to present time, and saying what is it that is causing modern consternation? More importantly, is there a pattern? Has this happened before? Were there some ordinary people like you and I, shopkeepers in Rome, who were standing around and saying, "You know, our leaders don't seem to be on top of our problems. They seem to be getting worse one generation after another."

There had to have been some ordinary citizens who felt like things were getting away from them in some substantial way. And that's what drove me to start looking at whether there were any early symptoms of the demise of a civilization. Because we're fascinated with the cataclysmic event that shoves us over the cliff. You know, it's suspenseful, gets our adrenaline going, and in many ways we're programmed to respond to short-term fear. I throw a snake down on the ground and your body floods with chemicals; it wants to flee or attack the snake.

But we're not very good at looking at long-term threats, which are substantially larger than a snake. We have no physiological response; at least to this point in time, we haven't developed that. So we can see these things coming but we don't tend to preempt or react to them, and this is the greatest crime. We could talk about climate change, or the debt in the United States, or the nuclear disaster in Fukushima, which no one's talking about in the news anymore but it's ongoing; it's gonna be ongoing for hundreds of years.

So, my first look at what was causing modern consternation was to go back in history and look at the Mayan civilization, the Romans, the Great Khmer Empire, the Ming Dynasty, and to say what were the earliest symptoms? Not the event that historians and archaeologists and everybody are arguing about caused their ultimate destruction, but what were the earliest symptoms?

And it turns out that every one of those civilizations showed an early sign that progress was moving ahead of the actual evolution of the human being. There's a point in time in which the speed of complexity begins to accelerate, and the problems that a civilization has to deal with exceed the cognitive capabilities that we've thus far evolved. And when that happens we begin to fall behind. And there are three symptoms that show that we've reached that cognitive threshold, if you will.

Institutions that represent the bedrock of society become gridlocked and paralyzed. And they know what the greatest dangers are, but they become unable to act on them even though they have solutions. Now, the example that I use in my book is drought. We see these things, we have the data, we know they're coming, and we can do things. We could start building reservoirs like crazy. We could build desalination plants. We could do all kinds of things—have people build individual cisterns to collect water in their homes. But we're just not reacting. Again, the danger's too far out.

The second symptom that begins to show up is much more disturbing. In every one of these civilizations we see there's a mass confusion between what is an empirical fact and what is a belief, an unproven belief.

The last sign is that there's collapse. Because once you make public policy based on unproven beliefs, collapse is the next thing that happens to you.

And so a lot of people are very disturbed when they read my book, because they realize we're sort of in the middle of it. We have 1.5 billion readings of the Earth's surface temperature. We know it's going up. And yet you can look at a variety of surveys and somewhere between…I don't know, 60 to 70% of the American public does not believe in climate change.

Aengus Anderson: So that's interesting…

Costa: There you go.

Anderson: Yeah, because I mean in here, right, we've got empirical data, and there's a discrepancy with policy and with belief. In the Roman case it's like they don't even know what's happening out on the frontier. Is it different now, that we actually have much better objective evidence than earlier civilizations [crosstalk] would've had?

Costa: I don't think it is. And I'll tell you why: because for every study that you find on the Internet, you can find more that disprove it. So in a climate like that, with people working two and three jobs to put food on the table for their families and everything, is it any wonder that they're very confused?

Look at the current healthcare program. Obamacare. I don't know of a single person who understands Obamacare. I don't—and I talk to the smartest people in the world. You know I have a radio program. I talk at the highest level with the most intelligent people, the leaders of our country and the globe. And they're all confused. They need interpreters to interpret what it might mean. And I have people saying that, gee, our tax code used to be 400 pages twenty-five years ago, and now it's 70,000 pages.

What we need to understand is, complexity favors the wealthy. Because when General Electric looks at a 70,000-page tax code, they go hire a building of tax lawyers to figure out how to legally not pay one cent in taxes. Whereas the farm worker or the person that's a clerk at a retail store is either filling out their own form, the short form, or going to H&R Block and paying fifty bucks. And they're gonna pay the maximum tax.

So the more complex things get, the more there is an unequal distribution of knowledge and expertise. And in the end it favors those who can go get the best surgeons, the best doctors, the best tax lawyers—the ones who can work through the complexity.

Anderson: So in this case complexity is a useful tool that actually is not being used to, say, solve social problems but is being used to perpetuate a social system?

Costa: I would say that complexity creates a high-failure-rate environment. What is complexity? Let's just start there. A lot of people talk about complexity; I love the definition that comes out of Harvard University: that there are more wrong choices than there are right ones. And the number of wrong choices and relationships are exponentially growing at a faster rate than the number of right ones. So over a period of time, your odds of picking the right solution, the right choice, are getting worse and worse.

And right now, people like the former CEO of Google, Eric Schmidt, for example—he says that we generated as much data from the dawn of humankind to the year 2003 as we generate every forty-eight hours. So every two days, we're producing as much research, data, information, as we did from the dawn of mankind to the year 2003. Even if the data was there that would allow you to make a better decision, you aren't going to get to it. And so now we've got to come to terms with that.

You asked me about the Romans that didn't have enough information. We're on the other end of the spectrum. Having a million choices is the same as having none. Your brain can't sort through a million. Our brains are designed to solve problems like finding lost luggage at the airport. That's what the human brain to this point in time is designed to do. It is not designed to go through 200,000 apps on my cell phone.

Anderson: Why this complexity? I'm thinking of the Eric Schmidt example. And we've looked at this massive rise in information production so recently. And I think of the quality of life, and what I was doing before 2003, which seems much the same as what I'm doing now. I mean, I don't think that the quality of life has drastically changed. So what are we gaining? Why do we produce all of this complexity if it's not actually for something really drastic in terms of the way we actually live in the world?

Costa: Well, that's a very deep question. And it's where my research is taking me. So, I'm a little reluctant to speak about the why. I can only speak about the what. And then, I didn't want to write a book that was doom and gloom, so before I published my book I had to spend half the time of my research saying, do we have anything that these ancient civilizations didn't have that could circumvent a negative outcome? The greatest evolutionary asset that humans have is that we are by far the best living organism at being able to do thought experiments about the future outcomes of things and then take an action in the present to preempt those negative outcomes. And if you think about that as a survival advantage, that's the greatest advantage you could possibly have. We've squandered that.

Anderson: Well that's what I was going to ask you, if we've repeatedly squandered that. You've probably read Joseph Tainter's work?

Costa: Yes, of course.

Anderson: Okay, well I interviewed him earlier in this project and we were talking about kind of the great sine wave of rise and fall and rise and fall. And for him there's really no way out. We are always in that cycle. And given that we've done it so many times, what makes you optimistic that we are capable of preempting anything?

Costa: Because we have the biological capability. What has caused human beings to rise to the top of the living pyramid is that we are the greatest novelty-seeking organism. If you think about it, there's been no machine that has required constant stimulation and newness like the human organism, and I believe that going back to prehistoric times, we will see that that continues to be what fuels progress. It's our compulsion to seek out the new and then to develop new systems, new technology, new processes, new government. The human brain is now adapting to massive stimulation, and that's where, in the last half of my book, I spent most of my time with neuroscientists trying to figure out if there was something going on in the human brain that wasn't on our radar before. How are we cognitively adapting to greater levels of complexity?

And it turns out that there's a third form of problem-solving. We're all familiar with the left side of the brain and how it uses deconstruction to find lost luggage at the airport. You know, that example. And the right side of the brain uses more of a synthesis kind of thing: I'm talking to you and suddenly I notice there's a little sweat above your lip, and I think you're lying.

But every now and again, when you're dealing with problems that are way above your pay grade, a little part of the human brain called the ASTG (the anterior superior temporal gyrus) will light up about 300 milliseconds before you'll have a spontaneous insight. It's what we call an "aha moment." And this is very significant. Because prior to this, we kind of made these aha moments folklore. Like Newton sitting under a tree and an apple fell on his head. Or Archimedes sits in a tub and the water flows over and he discovers displacement theory. We kind of made these genius moments about eclectic, weird people who had spontaneous inventions and discoveries.

But no, it's actually a process in the human brain. But it looks to be a process that's extremely taxing. We can see that auxiliary functions in the brain sort of start to slow down, and the brain doesn't want to pay attention to anything else, almost as though it's going into a meditative state. We see the ASTG light up like a Christmas tree and then all of a sudden a person will blurt out an answer. And there's no question, it's 100% correct. And not only that, when you go to interview the person and say, "Well, how did you come up with that?" they go, "I don't know." There's no trace. There's no story.

Now, having said that, if you've never taken a physics course, you aren't going to suddenly come up with some insight in physics. It depends on the content that you have in your brain. So if that's the case, then the key is you have to be able to do things that allow the brain to load content at a very rapid rate so that we can stimulate more insights. Insights are effectively connecting the dots of two pieces of information in a way that you've never connected them before.

Anderson: But you still need to have a lot more data. I mean, even if you're saying that this thing connects data in a better way, solves problems in a better way…it seems like it's probably always been there; maybe it's more active now just because of the loads we're putting on it in this sort of complex society. But how can we get enough in the brain to even solve problems?

Costa: Well, it turns out there's a lot of research going on right now about how the brain wants to learn. It's interesting: the brain's steady state is learning. It wants to learn, and it's not happy unless it's learning and accepting information. We've learned that.

The second thing we've learned about the brain is that it wants to be warmed up (big surprise) before it accepts content. So there are any number of these brain fitness tools that seem to help people load content much more efficiently and much more naturally. For example Michael Merzenich; he is the neuroscientist that did the original work on brain plasticity. That's where the brain will rewire itself after an injury to allow other parts of the brain to compensate for the injured part.

So, he became very concerned about, well, how do we learn to learn? And what makes learning easy? And so he developed a series of tools—video games and things that you play that warm up your memory, warm up your frontal cortex, warm up all the parts of your brain. And then he began putting them out in schools and doing tests. For example, in Jacksonville County they gave these warm-up tools to schoolchildren, and they played them for fifteen, twenty minutes in the morning. Within three years, those children had twice the academic performance, with no other change in teachers, textbooks, computers, or anything.

So we do have tools that will allow us to make better decisions from a biological standpoint. But we also have tools that are not biological but more strategic. If we're facing high-failure-rate environments where there are more wrong choices than there are right ones, do we have any models? And we do have models. Venture capital is a model that I talk a lot about. A lot of people think venture capitalists are experts at success. Well actually, they're experts at failure. For every hundred companies they invest in, they only expect maybe ten or fifteen to do well. The others they expect to not perform. And yet, every venture capitalist I know is very wealthy and very successful. So there is an example of a high-failure-rate environment where they understand that no matter how much due diligence they do, they can't call it any better. To get to the solution, you may have 85% waste, but 15% will pay off, and those payoffs will be so large that they'll dwarf the waste and the loss.

So we also have strategic tools. So I don't want to just limit it to brain fitness and adding data and being able to prompt insight.

Anderson: Even if we optimize the brain to its very best point, it seems like the level of complexity we're still dealing with is daunting and also seems like it's a social tool. Like, there's always going to be more complexity as long as GE needs a longer tax code to make sure that they can evade and you can't. Because we're generating the complexity ourselves in a lot of ways, is it something that we don't even want to catch up to? I mean, is it a detriment to the people who are creating complexity to have us start sifting through it?

Costa: [long pause] I'm not a conspiracy person, so I should get that out.

Anderson: Yeah, I didn’t—

Costa: So I don't believe that there's a bunch of people in a room, a dark room with dim lights, saying—

Anderson: Going, "Let's have complexity!"

Costa: —let's make really complex—

Anderson: No, and I don't mean to insinuate that.

Costa: —so we can grab all the money and all the power.

Anderson: But it is…it does…I mean, it works, right? For The Conversation, I've been looking at forming it into a nonprofit. It is simply too complex to be worth my while.

Costa: I couldn’t agree with you more.

Anderson: I’m sure you’ve expe­ri­enced this.

Costa: Anybody thinking about becoming a nonprofit [crosstalk] should give it up.

Anderson: Shouldn’t do it. Yeah! It’s just not worth it. If you’re the size of the Sierra Club and you’ve got resources, then maybe it works, but.

Costa: And then you're dealing with the amount of money you spent to become a nonprofit that could have gone toward remedying the—

Anderson: Doing something.

Costa: Doing something, actually having a result.

Anderson: Right. And that seems like a little teeny slice of complexity. It's not conspiratorial at all. I mean, it's evolved over time as people've probably swindled tax-exempt status over and over and over again, and the nonprofit code got longer and longer. So nothing conspiratorial there, and yet it is…growing.

Costa: Yeah. If you ask me what makes me optimistic, it's the fact that when you have the information and the blueprint and you can look for the earliest symptoms, you have an opportunity to preempt and act preventatively. So if we want to prevent collapse we have to understand, well, what is it I should be worried about?

And look, 156 years ago when Charles Darwin discovered evolution, we made a wrong turn. We couldn't reconcile our religious beliefs with the timetable of evolution. Because when you strip it all down, dinosaurs would have eaten Adam and Eve. And so everyone said, "Well…no. We have to reject that we're part of the living world."

And despite the fact that guys like Watson and Crick discovered the actual mechanics of evolution, and natural history museums all over the world are filled with physical evidence, evolution today is still probably the most controversial word next to "abortion." The minute you say "evolution" you could see people going, "Oh no no no no. We don't want to talk about that because that means we're godless." Since when is man's relationship to the natural world a godless thing? The minute we parted from the natural world, we denied our limitations.

Anderson: That wasn't a departure that happened with Darwin, that was a… You know, we'd always seen ourselves, or at least in a lot of the Abrahamic faiths, as very separate from the natural world. You know, there are other faith traditions where we're integrated into it. But it seems like Darwin was a moment where those traditions sort of reasserted themselves, but preserved a sense that we are separate, that we act upon the world…for our use.

Costa: Well, here's why it was significant. We only have two baskets we can draw from: knowledge and unproven belief. So when knowledge drops off, it's a free-for-all. But when empirical information comes forward, and you reject that in favor of unproven belief, you've got a problem.

Anderson: But are we [crosstalk] a rational animal?

Costa: Because you're rejecting knowledge. You're rejecting knowledge. And when Charles Darwin discovered evolution, everything changed. Modern economics is evolution. Gradualism in economies is based on the theory of evolution. Political theory is based on evolution. We based everything else on evolution and yet we reject it. And so when we separated from the natural world, we rejected the idea that we are tethered to the rate that we can adapt.

Anderson: I was talking to an economics blogger the other week, a guy named Charles Hugh Smith. We were talking about abundance, and I was asking him, you know, do you think that in situations of abundance we have a lowered incentive to use the reasoning part of our brain, because the status quo is doing things well enough and we can exist in a little bit more of a fantasy world? And it's only in situations of scarcity when we really have to put all the fantasy aside, we have to look hard at facts, and make serious decisions, because there are people dying or we are starving or something like that. Do you think the crisis we're dealing with now is just one that, we can see all this stuff but…we're pretty comfy?

Costa: So let's buy into that. Let's say that's true. Abundance makes us a little more relaxed, a little lazier, a little less ambitious, a little less diligent. We still have our rational mind. If we know that about ourselves, then we can put into place compensatory behaviors.

Let me give you an example about myself. I've noticed that when I start to get a lot of money in my checking account, I'm more predisposed to buy stuff. So, since I know that about myself, when I get to a certain level in my checking account, I immediately go down and move that money into a CD where I cannot touch it. That's my compensatory behavior. So, your rational mind can override these predispositions. The key is—

Anderson: Well, yours can.

Costa: —to know what they are. If abundance is creating a lackadaisical attitude, then get rid of the abundance.

Anderson: I think what I'm curious about there, though, is you've got to deal with a lot of people who may never accept science, who may never want to override their predispositions, but are also equally entitled to a vote.

Costa: Well, let's just start with the big picture and drill down. The big picture is you can go to any country in the entire world, and they all have the same problems. They're fighting with manic swings in financial markets. They're fighting with job creation. Immigration. Terrorism. Climate change. Clean water issues. They all have the same problems, every government.

Now, they all have different political systems. And they all have different economies. So if our problems were economic, or political, they would have different problems. Our problems are not economic or political. So, for anybody listening to this, the lightbulb's gotta go off. If everybody is having the same problem, that's my definition of a species-wide problem.

Now let's drill down to you. What are you not happy with? Are you carrying around twenty extra pounds? Would you like to know why you go for the pizza instead of the salad? Would you like to know why that allowed your ancestors to survive? The environment changes faster than biology. And most people don't know that. They don't know why they're compelled to act in that way, but I'm gonna tell ya, if we go back to that, suddenly the guilt, the blame, the shame is gone. Because that big bat that you beat yourself up with because you're a shopaholic, or you're fat, or you have a mistress and you keep cheating on your wife—you may not understand the biological impetus for that.

Now that doesn't excuse it, by the way. As you know, I'm a big believer that you use your rational mind to develop a compensatory behavior. In the same way that when your brain is struggling to accept data, you can use brain fitness.

Anderson: But these are all for pretty simple problems; personal problems of individual free will. But when it comes to, say, something really systemic and diffuse, where not only do you need to have a compensatory behavior, but you need to have it in regards to a system with more variables than you can understand, like climate. In a case like that, is it possible to do the same sort of compensatory tricks with your mind?

Costa: We have to always go to the rational evidence. And we have to make those decisions on public policy based on the empirical data that we have available, and not allow ourselves to get trapped like the Mayans did and allow public policy to be forged on beliefs.

Anderson: You know, a big part of my conversation with George Lakoff was him saying that look, we're not really reason-driven. You know, and so when I think about our conversation, how do we tell people, "You need to have these compensatory systems"? I wonder, how many people are just never going to hear that?

Costa: I don't think that's my problem. [laughs] Sorry. I'm not out to beat anybody up and say, "You've gotta listen to me." I think I have some good ideas, some good observations. I hope that they're helpful, and that they're prescriptive in some way. But I believe certainty is the enemy of knowledge. And to the extent that people think I'm so certain that they should just listen to me, they're listening to another Jim Jones or another Bernie Madoff—no, you're a critical thinker. You have to take what I've told you and you have to evaluate it. That is an individual decision. I'm not trying to convince anybody, I'm just trying to say, "This is what we have, this is what we've done in the past. This is what we could do. We seem to have some options on the table. What do you think?"

We should all be open to being wrong, at all times, and yet passionate about what it is that we think we know at any particular time. It's possible to be both.

Anderson: I mean, these systems, whether it's the environment or the economy—you can get a lot of empirical data, but we don't understand them inside and out. I've had people in this project, neoprimitivists, talk about how you just need to go back, you know. Every time you make technological changes, you keep rolling the dice and rolling the dice, and it seems like what we're talking about here are more intelligent ways to roll the dice. But by always running forward with this stuff, you just keep running into scenarios where, as you said earlier with your definition of complexity, you just have more and more bad choices. Is it just too much of a gamble? Is it a bad roll?

Costa: You know, you can tell I'm a cheerful person for having written about collapse, and people say, "You know, your personality doesn't match your study." And I get that a lot. And that's because I'm around neuroscientists that are making discoveries about how we're adapting. I'm around people that are developing brain fitness tools…

Not too long ago I had an opportunity to watch IBM's computer Watson. Watson was a computer that many people maybe saw on Jeopardy. They got the Watson computer to sit between the two biggest Jeopardy champions, and Watson won. But what was really interesting is IBM took that and said, "We want to put this in an ER. Start loading all the medical data," which is basically doubling every four to five years. There's probably nowhere where complexity plays a bigger role than when a strange patient who you have no records on is rushed into the ER and you have a matter of seconds to make a decision about which procedures, in which order, need to be done.

And they put Watson in, I believe it's Boston, Mass., so that Watson could take normal English from anybody who was on the ER floor. And Watson had all the medical information current, and would spit out what procedures, in order, should be done, and what data would allow Watson to improve its recommendations by 36% or 21%. And if you clicked on "Why, Watson, are you telling me to do this?" it would give you the background. It would say, "I am making this recommendation based on the following." And it would do this in normal English.

Now, there's an application of technology that offsets complexity. That allows the data now to be prioritized and used in real time, in a matter of picoseconds, and do something that humans can't do. It's a compensatory technology. And I believe that there's room for an entire category of technologies that are compensatory.

Anderson: But when it comes to really big things like the economy or the environment, are you into a different scale of system entirely, where compensatory system— There's no compensatory system that solves carbon emissions, because ultimately what it would demand is that you stop emitting carbon. And you can say well, that's a simple solution, but it demands a multitude of changes for which there is no answer on the ground, and we don't actually know how to structure an economy that wouldn't do that without creating big financial problems. Can technology solve that?

Costa: We have to start with acknowledging that we're a biological entity and that there is no surviving without clean water, clean air, food, love. Now let's move to these complex systems and say, "What technology is compensatory and will allow me to bridge the gap between the slowness of evolution, my emotions, my desire to eat glazed donuts? What technologies can I use to bridge that gap that are friendly to the human organism?"

So, we can complain about systemic problems, but if we're cooperating with those systemic problems… Let me give you an example: the current debt. How many people are carrying credit card balances? How did you get those credit card balances? You spent more than you earned. So, now you say, "Well gosh, you know, we're going into inflation, and I can't buy a house, and I can't this and that," and I'm saying to myself, well, it might be a complex, systemic problem, but what was your participation in it, and what compensatory behavior did you put into place to keep yourself from participating in and fueling that complex system?

Anderson: At the same time, there are systems where you can't opt out, right? So, to live in the society, just based on its physical footprint, you've got to own a car. Your car's got to be insured. You know, you kinda can't play by the rules, and for a lot of people that drops them right into debt. So there's this interesting larger social framework where even if you know the compensatory things that you need to do, you're damned if you do and damned if you don't.

Costa: I think that it starts with very small decisions that you can come to peace with in yourself. And that when more and more people engage in compensatory behaviors, the system begins to change. Because what causes a system to change is critical mass.

Anderson: If we know that we can take these compensatory measures, why not make the compensatory measure decelerating all the technology? Wouldn't it be awfully simple to live in a much simpler society, knowing that that suits our biology better?

Costa: We don't go backwards in complexity. There's nobody that wants to go grow their own food.

Anderson: [laugh­ing] I’ve actu­al­ly known a lot of peo­ple who do.

Costa: Well, I’m just say­ing the vast major­i­ty of peo­ple are not say­ing, I can’t wait to become a farmer—”

Anderson: Right.

Costa: “—and I can’t wait to wash my clothes at the riv­er.” And I became fas­ci­nat­ed. As a social sci­en­tist I don’t under­stand what our com­pul­sion for progress is. Why do we want—

Anderson: Right, or how we even define progress, right? [crosstalk] Because is more stuff progress?

Costa: Well, progress to me is more effi­cient, faster, and more. This is the pan­dem­ic addic­tion.

Anderson: I mean, in that case doesn’t it feel like Tainter is kin­da right? We’re just gonna keep like, putting our fin­ger in that bee­hive until we final­ly get stung instead of hon­ey.

Costa: Tainter is not wrong… But, Tainter is in some ways fatal­is­tic. Not fatal­is­tic as in "we're all going to die" but fatal­is­tic as in—

Anderson: Fatalistic in that there will be a reduc­tion in com­plex­i­ty.

Costa: —that there is always a— We always revert to sim­pler sys­tems that we’re designed to man­age. So, we’re not going to go back­wards, but we do have to use tech­nol­o­gy in a way that is com­pen­sato­ry for what humans are designed to do and not designed to do.

Anderson: In a way, if things con­tin­ue grow­ing expo­nen­tial­ly more com­plex and we were able, some­how, to keep fol­low­ing that tra­jec­to­ry, would it just raise the stakes so high that even­tu­al­ly we’d [crosstalk] face some­thing worse?

Costa: What do you sup­pose will hap­pen if we have a mas­sive finan­cial col­lapse in the United States?

Anderson: I think we have a stam­pede men­tal­i­ty. I think peo­ple would have seen many many many decades of sta­bil­i­ty com­ing to an end, and I think a lot of things we take as ratio­nal behav­ior would leave peo­ple, quick­ly.

Costa: Mm hm. Well, I have a friend of mine who’s an econ­o­mist and he says anar­chy is just four missed meals away.

Anderson: What’s his name?

Costa: John Sumser. And he may be right. You know, maybe we revert to our prim­i­tive self. I believe what'll hap­pen is that we will revert to sys­tems that we all under­stand. Slowly, farm­ers mar­kets would open up, and some­body would roll up with a tire and say, "I got a tire, I noticed your truck need­ed a new tire, and I'll trade you four sacks of pota­toes."

Anderson: Mm hm. Here's kind of the ele­phant in the room when I think about this: pop­u­la­tion. It's his­tor­i­cal­ly unprece­dent­ed, we know that. Can it only be car­ried by the real­ly com­plex sys­tems, and is what makes col­lapse scary the notion that if we go back to sys­tems we can under­stand, we have to go back to a num­ber of peo­ple who can be sup­port­ed by those sys­tems? Because I've had a lot of peo­ple go, "Collapse won't be bad, there's some­thing bet­ter on the oth­er side," but I always wonder…who dies?

Costa: Well it's…you're right. You described it as the ele­phant in the room, and yes, we are over­pop­u­lat­ed. Now, why has every mea­sure to con­trol pop­u­la­tion failed? It's not because we don't have birth con­trol. It's because the strongest two dri­ves in nature are what? The dri­ve to sur­vive and the dri­ve to prop­a­gate. So sud­den­ly now we're going, "Don't. Propagate." And if you're going to prop­a­gate make sure you go down to the drug store and get a con­dom and you know— And we're putting all the steps in and then they don't work.

We’ve got to get in touch with what we are. You’re right. We’re over­pop­u­lat­ing. It’s not a big sur­prise. It’s not going to end. And when it does end it’s going to end bad­ly. It’ll be a pan­dem­ic virus. Wars are tak­ing care of some of this. Climate change will take care of some of it. We’re going to see mas­sive droughts, mas­sive changes in weath­er. Starvation that nobody can devel­op enough food for.

Anderson: Is that some­thing that we can use rea­son to over­ride?

Costa: Well of course you can use rea­son to over­ride that.

Anderson: You think so? Because it doesn’t seem like it hap­pens.

Costa: We’re not deal­ing with the sever­i­ty of the dri­ve. We’re act­ing like the dri­ve to have sex is like the dri­ve to…eat a ham­burg­er this after­noon. It’s not at the same lev­el.

Anderson: So almost the chal­lenge of our time, then, is to have a real con­ver­sa­tion about what we actu­al­ly are as an organ­ism.

Costa: Absolutely, and I think that when we have these com­pul­sions, we can deal with them. We have to say yes, this is how we are designed. Let’s cre­ate out­lets for that. And not be in denial about it, you know. A lot of this is just our puri­tan­i­cal upbring­ings and our reli­gious upbring­ings, you know. I don’t know.

You know, I have a lot of feel­ings about where we're head­ed, and I think where we're head­ed is an envi­ron­ment that is not friend­ly to the human organ­ism itself. Because we don't under­stand what the human organ­ism is and what it needs. And so if you head down that road yes, Joseph Tainter's right. We're going to go up and down, and our sur­vival will be elas­tic. I believe that we have a great asset that has tak­en mil­lions of years to develop—let's use it.

Anderson: So you think we can break Tainter’s cycle, in a way.

Costa: Absolutely, because I think neu­ro­science is teach­ing us how we think, how we learn. What gives me opti­mism is that our great­est asset is pre­emp­tion. Is the abil­i­ty to look ahead, and then take an action now to either min­i­mize a neg­a­tive out­come or avert it alto­geth­er. And that is what I want us to look at.


Aengus Anderson: So there we go. Preemption. That is the idea that this conversation comes back to again and again and again. The idea that the rational part of your brain can sort of govern and manage the emotional part of your brain. There are a lot of different ways that people have dealt with reason in this project. A lot of people talk about the intelligence and the necessity of emotions; that's actually going to come into play in our next conversation with Kim Stanley Robinson. But this one's very much about how reason needs to govern the unruly emotions.

Neil Prendergast: So many of our interviews have taken up this idea and—in different ways of course, but I think one thing that we've always looked at is, well then how would that mechanism of this sort of rationalist thought filter into society so that it's actually something that's working at that…well, really just societal level.

Anderson: The mechanism… I think that's the word you used, mechanism, that was something that I tried to go after a lot in here. Now, I'm still a little bit unclear on like, what is the mechanism. You know, something that really intrigued me was the way that this sort of telescoped up and down between the macro level, like we need to have compensatory measures on a bigger level. Like, here's how you avoid climate change through compensatory measures. But also how it really went down to kind of the personal, self-help level. Here's a compensatory measure that keeps you from the doughnut.

Prendergast: Right. Right.

Anderson: It was kind of jarring in a way because I was like, wait. Here we're talking about climate change and here we're talking about…doughnut lust.

Prendergast: Right.

Anderson: But like, for her, they're both pieces of the same psychological problem.

Prendergast: Right. You think about doughnut lust, I think the thing that stood out for me was sex and hamburgers. [both laugh]

Anderson: You know…

Prendergast: I wasn't sure which, actually. But you know, I think it's interesting because it does map onto some of the earlier conversations, ones in this newer set that we're doing right now. In particular George Lakoff.

Anderson: Mm hm.

Prendergast: And you know, I'm thinking here of the terminology used there between systemic causation and sort of individual causation.

Anderson: Right.

Prendergast: And it seemed like you guys are swimming in the same waters in this conversation.

Anderson: In the same waters…? Perhaps. Because we're talking about the same…themes. But boy there are some really stark differences. Just to go back to the idea of reason and emotion. I mean, Lakoff would say like, "There's a mind. And the reason/emotion divide is silly. They're so interwoven."

I felt that something I got out of the conversation with Lakoff—and I'm not sure if this is exactly what he would have wanted to impart. But something I got out of it was there are different types of reason. Totally different types of reason. And so, what seems totally irrational to one person could be absolutely rational to another.

Prendergast: Right.

Anderson: And I feel like, when Rebecca's talking about reason, there is sort of like, some ontological reason that's out there that everyone has a part of.

Prendergast: I think that they also have different implications, too. So for example, with Lakoff it really seemed like in his mind we should understand systemic causation better, because we should try to change the system. And I know from his other work that's what he's trying to do.

But it really seemed that Costa was going in a different direction. I kind of want to go back to an early part of the interview that you did, where she was talking I think about diet? And she said well, once you know that there's these systemic reasons for maybe why it's difficult to lose weight if you're trying to do that, then you said to yourself, "Oh, well then I don't feel so morally bad for not being able to do it."

And I think then Lakoff would probably say, "Well then okay, let's figure out why society has shaped our environment so that it's difficult to do this, and let's play the policy game of figuring out how we can change the environment so that it's easier to lose weight, perhaps."

Anderson: Right.

Prendergast: And she seems to just sort of stay over here in the, "Well, now I feel better about it, and maybe I'll make a different individual decision." But I didn't see that loop back to the political.

Anderson: Yeah, and that's a huge difference, right. I mean, I felt like talking to Lakoff, everything is about the political. With Costa…it's kind of the individual inheriting all of this biology, and then grappling with that, and then in some small way maybe that makes larger change. But the bigger systemic change? isn't really her interest. And she says that at one point, where she says you know, "I'm not out trying to like, beat people over the head and change their minds. What I really want is I just want to say what I know, and then let people sort of determine for themselves." So again, very much an individualistic notion.

Prendergast: I think at the very least we could say you know, she certainly is an indication that people today are very very interested in thinking about human biology as they try to at least address the problems of the day.

Anderson: We could talk about this forever, as with any of these things. But something that I was sort of left thinking about was, where are the political teeth in this? How does it really create meaningful, directed change? Like, what are the prescriptive parts of it beyond the kind of self-help aspects? For me this really crystallized when we were talking about complexity as a social tool.

And Rebecca says you know, GE can hire a building full of tax lawyers, and so they can navigate complexity, right. Complexity's a system that benefits them. And I wanted to know, well in that case will complexity never go away because there are a lot of rich people who benefit from it? It's like a filter that keeps them in an upper echelon of society. And it felt like she didn't want to go there. She's like, "Well, I'm not thinking conspiratorially." But of course, I don't think that has to be a conspiratorial thing. It's just good business. But I felt like that was something that she didn't want to…touch. If you'd thrown an idea like that at Lakoff, he could've received it in a whole variety of ways. But you know he woulda gone straight into the politics of it and been like, "Well yeah, here's some entrenched power interests. Here's how this helps them. Here's how we take that power back."

That was Rebecca Costa, recorded on June 14th, 2013 in Santa Cruz, California.

Further Reference

This interview at the Conversation web site, with project notes, comments, and taxonomic organization specific to The Conversation.

