Aengus Anderson: Well Micah Saul is not joining us today, but he will be back next time. So in the interim you're stuck with just me, Aengus Anderson…
Neil Prendergast: And me, Neil Prendergast.
Anderson: And this episode is my conversation with Rebecca Costa. She's a self-proclaimed sociobiologist, a former advertising executive, a radio talk show host—The Costa Report may be syndicated in your community—and most notably the author of The Watchman's Rattle: A Radical New Theory of Collapse.
Prendergast: We first learned about her work early on in our episodes. She shares with us this notion that our own human biology has not kept up with the complexity of human society. And it seemed like an idea worth investigating a little bit more.
Rebecca Costa: If you were to ask me what the crisis in the present is, as an evolutionary biologist I have to go back millions of years and try to connect all the dots, going back to man as a single‐celled organism to present time, and saying what is it that is causing modern consternation? More importantly, is there a pattern? Has this happened before? Were there some ordinary people like you and I, shopkeepers in Rome, who were standing around and saying, “You know, our leaders don’t seem to be on top of our problems. They seem to be getting worse one generation after another.”
There had to have been some ordinary citizens who felt like things were getting away from them in some substantial way. And that’s what drove me to start looking at whether there were any early symptoms of demise of a civilization. Because we’re fascinated with the cataclysmic event that shoves us over the cliff. You know, it’s suspenseful, gets our adrenaline going, and in many ways we’re programmed to respond to short‐term fear. I throw a snake down on the ground and your body floods with chemicals; it wants to flee or attack the snake.
But we’re not very good at looking at long-term threats, which are substantially larger than a snake. We have no physiological response, at least to this point in time; we haven’t developed that. So we can see these things coming but we don’t tend to preempt or react to them, and this is the greatest crime. We could talk about climate change, or the debt in the United States, nuclear disaster in Fukushima, which no one’s talking about in the news anymore, but it’s ongoing; it’s gonna be ongoing for hundreds of years.
So, my first look at what was causing modern consternation was to go back in history and look at the Mayan civilization, the Romans, the Great Khmer Empire, the Ming Dynasty, and to say what were the earliest symptoms? Not the event that historians and archaeologists and everybody are arguing about caused their ultimate destruction, but what were the earliest symptoms?
And it turns out that every one of those civilizations showed an early sign that progress was moving ahead of the actual evolution of the human being. There’s a point in time in which the speed of complexity begins to accelerate, and the problems that a civilization has to deal with exceed the cognitive capabilities that we’ve thus far evolved. And when that happens we begin to fall behind. And there are three symptoms that show that we’ve reached that cognitive threshold, if you will.
Institutions that represent the bedrock of society become gridlocked and paralyzed. And they know what the greatest dangers are, but they become unable to act on them even though they have solutions. Now, one of the examples that I use in my book is drought. We see these things, we have the data, we know they’re coming, and we can do things. We could start building reservoirs like crazy. We could build desalination plants. We could do all kinds of things—have people build individual cisterns to collect water in their homes. But we’re just not reacting. Again, the danger’s too far out.
The second symptom that begins to show up is much more disturbing. In every one of these civilizations we see there’s a mass confusion between what is an empirical fact and what is a belief, an unproven belief.
The last sign is that there’s collapse. Because once you make public policy based on unproven beliefs, collapse is the next thing that happens to you.
And so a lot of people are very disturbed when they read my book, because they realize we’re sort of in the middle of it. We have 1.5 billion readings of the Earth’s surface temperature. We know it’s going up. And yet you can look at a variety of surveys and somewhere between…I don’t know, 60 to 70% of the American public does not believe in climate change.
Aengus Anderson: So that’s interesting…
Costa: There you go.
Anderson: Yeah, because I mean in here, right, we’ve got empirical data, and there’s a discrepancy with policy and with belief. In the Roman case it’s like they don’t even know what’s happening out on the frontier. Is it different now, that we actually have much better objective evidence than earlier civilizations [crosstalk] would’ve had?
Costa: I don’t think it is. And I’ll tell you why, because for every study that you find on the Internet, you can find more that disprove it. So in a climate like that, with people working two and three jobs to put food on the table for their families and everything, is it any wonder that they’re very confused?
Look at the current healthcare program. Obamacare. I don’t know of a single person who understands Obamacare. I don’t—and I talk to the smartest people in the world. You know I have a radio program. I am at the highest level, the most intelligent people, the leaders of our country and the globe. And they’re all confused. They need interpreters to interpret what it might mean. And I have people telling me that, gee, our tax code used to be 400 pages twenty‐five years ago, and now it’s 70,000 pages.
What we need to understand is, complexity favors the wealthy. Because when General Electric looks at a 70,000-page tax code, they go hire a building of tax lawyers to figure out how to legally not pay one cent in taxes. Whereas the farm worker or the person that’s a clerk at a retail store is either filling out their own form, the short form, or going to H&R Block and paying fifty bucks. And they’re gonna pay the maximum tax.
So the more complex things get, the more there is an unequal distribution of knowledge and expertise. And in the end it favors those who can go get the best surgeons, the best doctors, that can work through the complexity; the best tax lawyers.
Anderson: So in this case complexity is a useful tool that actually is not being used to say, solve social problems but is being used to perpetuate a social system?
Costa: I would say that complexity creates a high failure rate environment. What is complexity? Let’s just start there. A lot of people talk about complexity; I love the definition that comes out of Harvard University: that there are more wrong choices than there are right ones. And the number of wrong choices and relationships is growing exponentially, at a faster rate than the number of right ones. So over a period of time, your odds of picking the right solution, the right choice, are getting worse and worse.
And right now, people like Eric Schmidt, the former CEO of Google, for example—he says that we now generate as much data every forty‐eight hours as we generated from the dawn of humankind to the year 2003. So every two days, we’re producing as much research, data, information, as we did from the dawn of mankind to the year 2003. Even if the data was there that would allow you to make a better decision, you aren’t going to get to it. And so now we’ve got to come to terms with that.
You asked me about the Romans that didn’t have enough information. We’re on the other end of the spectrum. Having a million choices is the same as having none. Your brain can’t sort through a million. Our brains are designed to solve problems like finding lost luggage at the airport. That’s what the human brain to this point in time is designed to do. It is not designed to go through 200,000 apps on my cell phone.
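[Editor's note: Costa's "more wrong choices than right ones" definition of complexity can be made concrete with a toy calculation. The growth rates below are illustrative assumptions, not figures from the interview: if right options grow linearly while wrong options grow exponentially, the chance of a random pick being right collapses toward zero.]

```python
# Toy sketch (assumed numbers): right options grow linearly with time,
# wrong options grow exponentially, so the odds of randomly picking a
# right option shrink toward zero as complexity compounds.
def odds_of_right_choice(t, right_per_step=10, wrong_base=2):
    right = right_per_step * t      # e.g. 10 new right options per step
    wrong = wrong_base ** t         # wrong options double every step
    return right / (right + wrong)  # chance a random pick is right

for t in (1, 5, 10, 20):
    print(t, round(odds_of_right_choice(t), 4))
```

At t=1 a random choice is right about 83% of the time; by t=20 the odds are a fraction of a percent, which is the "getting worse and worse" dynamic Costa describes.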
Anderson: Why this complexity? I’m thinking of the Eric Schmidt example. And we’ve looked at this massive rise in information production so recently. And I think of the quality of life, and what I was doing before 2003, which seems much the same as what I’m doing now. I mean, I don’t think that the quality of life has drastically changed. So what are we gaining? Why do we produce all of this complexity if it’s not actually for something really drastic in terms of the way we actually live in the world?
Costa: Well, that’s a very deep question. And it’s where my research is taking me. So, I’m a little reluctant to speak about the why. I can only speak about what. And then I didn’t want to write a book that was doom and gloom, so before I published my book I had to spend half the time of my research saying do we have anything that these ancient civilizations didn’t have that could circumvent a negative outcome? The greatest evolutionary asset that humans have, we are by far the greatest living organism at being able to do thought experiments about the future outcomes of things and then be able to take an action in the present to preempt those negative outcomes. And if you think about that as a survival advantage, that’s the greatest advantage you could possibly have. We’ve squandered that.
Anderson: Well that’s what I was going to ask you, if we’ve repeatedly squandered that. You’ve probably read Joseph Tainter’s work?
Costa: Yes, of course.
Anderson: Okay, well I interviewed him earlier in this project and we were talking about kind of the great sine wave of rise and fall and rise and fall. And for him there’s really no way out. We are always in that cycle. And given that we’ve done it so many times, what makes you optimistic that we are capable of preempting anything?
Costa: Because we have the biological capability. What has caused human beings to rise to the top of the living pyramid is that we are the greatest novelty‐seeking organism. If you think about it, there’s been no machine that has required constant stimulation and newness the way the human organism does, and I believe that going back to prehistoric times, we will see that that continues to be what fuels progress. It’s our compulsion to seek out new and then to develop new systems, new technology, new processes, new government. The human brain is now adapting to massive stimulation, and that’s where, in the last half of my book, I spent most of my time with neuroscientists trying to figure out if there was something going on in the human brain that wasn’t on our radar before. How are we cognitively adapting to greater levels of complexity?
And it turns out that there’s a third form of problem‐solving. We’re all familiar with the left side of the brain and how it uses deconstruction to find lost luggage at the airport. You know, that example. And the right side of the brain uses more of a synthesis kind of thing. I’m talking to you and suddenly I notice there’s a little sweat above your lip, and I think you’re lying.
But every now and again, when you’re dealing with problems that are way above your pay grade, a little part of the human brain called the ASTG (anterior superior temporal gyrus) will light up about 300 milliseconds before you’ll have a spontaneous insight. It’s what we call an “aha moment.” And this is very significant. Because prior to this, we kind of made these aha moments folklore. Like Newton sitting under a tree and an apple fell on his head. Or Archimedes sits in a tub and the water flows over and he discovers displacement theory. We kind of made these genius moments kind of about eclectic weird people that had spontaneous inventions and discoveries.
But no, it’s actually a process in the human brain. But it looks to be a process that’s extremely taxing. We can see that auxiliary functions in the brain sort of start to slow down, and the brain doesn’t want to pay attention to anything else, almost as though it’s going into a meditative state. We see the ASTG light up like a Christmas tree and then all of a sudden a person will blurt out an answer. And there’s no question, it’s 100% correct. And not only that, when you go to interview the person and say, “Well how did you come up with that?” they go, “I don’t know.” There’s no trace. There’s no story.
Now, having said that, if you’ve never taken a physics course, you aren’t going to suddenly come up with some insight in physics. It depends on the content that you have in your brain. So if that’s the case, then the key is you have to be able to do things that allow the brain to load content at a very rapid rate so that we can stimulate more insights. Insights are effectively connecting the dots of two pieces of information in a way that you’ve never connected them before.
Anderson: But you still need to have a lot more data. I mean, even if you’re saying that like, this thing connects data in a better way, solves problems in a better way…seems like it’s probably always been there, maybe it’s more active now just because of the loads we’re putting on it in sort of a complex society. But how can we get enough in the brain to even solve problems?
Costa: Well, it turns out there’s a lot of research going on right now about how the brain wants to learn. It’s interesting that you can put the brain in a state where its steady state is learning. Where it wants to learn and it’s not happy unless it’s learning and accepting information. We’ve learned that.
The second thing we’ve learned about the brain is that it wants to be warmed up (big surprise) before it accepts content. So there are any number of these brain fitness tools that seem to help people load content much more efficiently and much more naturally. For example, Michael Merzenich is the neuroscientist who did the original work on brain plasticity. That’s where the brain will rewire itself after an injury to allow other parts of the brain to compensate for the injured part.
So, he became very concerned about well, how do we learn to learn? And what makes learning easy? And so he developed a series of tools that are video games and things that you play that warm up your memory, warm up your frontal cortex, warm up all the parts of your brain. And then he began putting them out in schools and doing tests. And for example in Jacksonville County, they gave these warm‐up tools to schoolchildren. And they played them for fifteen, twenty minutes in the morning. Those children had twice the academic performance with no other change in teachers, textbooks, computers or anything, within three years.
So we do have tools that will allow us to make better decisions from a biological standpoint. But we also have tools that are not biological but they’re more strategic. If we’re facing high failure rate environments where there are more wrong choices than there are right ones, do we have any models? And we do have models. Venture capital is a model that I talk a lot about. A lot of people think venture capitalists are experts at success. Well actually they’re experts at failure. For every hundred companies they invest in, they only expect maybe ten or fifteen to do well. The others they expect to not perform. And yet, every venture capitalist I know is very wealthy and very successful. So there is an example of a high failure rate environment where they understand that no matter how much due diligence they do, they can’t call it any better. To get to the solution, you may have 85% waste, but 15% will pay off and those payoffs will be so large that they’ll dwarf the waste and the loss.
So we also have strategic tools. So I don’t want to just limit it to brain fitness and adding data and being able to prompt insight.
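[Editor's note: the venture-capital arithmetic Costa describes can be sketched as a quick expected-value check. The stake size and 20x payoff multiple below are hypothetical, chosen only to show how a 15% hit rate can dwarf 85% losses.]

```python
# Hypothetical portfolio (illustrative numbers): 100 investments of $1M
# each; 85 go to zero, 15 return 20x. Despite the 85% failure rate, the
# portfolio as a whole multiplies its capital 3x.
def portfolio_multiple(n=100, stake=1_000_000, winners=15, payoff=20):
    invested = n * stake                # total capital deployed
    returned = winners * stake * payoff # losers return nothing
    return returned / invested          # overall return multiple

print(portfolio_multiple())  # 15 * 20 / 100 = 3.0
```

The design point is Costa's: in a high failure rate environment you don't try to pick better, you size the portfolio so the rare large payoffs swamp the expected waste.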
Anderson: Even if we optimize the brain to its very best point, it seems like the level of complexity we’re still dealing with is daunting and also seems like it’s a social tool. Like, there’s always going to be more complexity as long as GE needs a longer tax code to make sure that they can evade and you can’t. Because we’re generating the complexity ourselves in a lot of ways, is it something that we don’t even want to catch up to? I mean, is it a detriment to the people who are creating complexity to have us start sifting through it?
Costa: [long pause] I’m not a conspiracy person, so I should get that out.
Anderson: Yeah, I didn’t—
Costa: So I don’t believe that there’s a bunch of people in a room, a dark room with dim lights, saying—
Anderson: Going, “Let’s have complexity!”
Costa: —let’s make really complex—
Anderson: No, and I don’t mean to insinuate that.
Costa: —so we can grab all the money and all the power.
Anderson: But it is…it does…I mean, it works, right? For The Conversation, I’ve been looking at forming it into a nonprofit. It is simply too complex to be worth my while.
Costa: I couldn’t agree with you more.
Anderson: I’m sure you’ve experienced this.
Costa: Anybody thinking about becoming a nonprofit [crosstalk] should give it up.
Anderson: Shouldn’t do it. Yeah! It’s just not worth it. If you’re the size of the Sierra Club and you’ve got resources, then maybe it works, but.
Costa: And then you’re dealing with the amount of money you spent to become a nonprofit that could have gone toward remedying the—
Anderson: Doing something.
Costa: Doing something, actually having a result.
Anderson: Right. And that seems like that’s a little teeny slice of complexity. It’s not conspiratorial, at all. I mean, it’s evolved over time as people’ve probably swindled tax‐exempt status over and over and over again, and the nonprofit code got longer and longer. So nothing conspiratorial there and yet it is…growing.
Costa: Yeah. If you ask me what makes me optimistic, it’s the fact that when you have the information and the blueprint and you can look for the earliest symptoms, you have an opportunity to preempt and act preventatively. So if we want to prevent collapse we have to understand well, what is it I should be worried about?
And look, 156 years ago when Charles Darwin discovered evolution, we made a wrong turn. We couldn’t reconcile our religious beliefs with the timetable of evolution. Because when you strip it all down, dinosaurs would have eaten Adam and Eve. And so everyone said, “Well…no. We have to reject that we’re part of the living world.”
And despite the fact that guys like Watson and Crick discovered the actual mechanics of evolution, and natural history museums all over the world are filled with physical evidence, evolution today is still probably the most controversial word next to “abortion.” The minute you say “evolution” you could see people going, “Oh no no no no. We don’t want to talk about that because that means we’re godless.” Since when is man’s relationship to the natural world a godless thing? The minute we parted from the natural world, we denied our limitations.
Anderson: That wasn’t a departure that happened with Darwin, that was a… You know, we’d always seen ourselves, or at least in a lot of like Abrahamic faiths, as very separate from the natural world. You know, there are other faith traditions where we’re integrated into it. But it seems like Darwin was a moment where those traditions sort of reasserted themselves, but preserved a sense that we are separate, that we act upon the world…for our use.
Costa: Well, here’s why it was significant. We only have two baskets we can draw from, knowledge and unproven belief. So when knowledge drops off, it’s a free‐for‐all. But when empirical information comes forward, and you reject that in favor of unproven belief, you’ve got a problem.
Anderson: But are we [crosstalk] a rational animal?
Costa: Because you’re rejecting knowledge. You’re rejecting knowledge. And when Charles Darwin discovered evolution, everything changed. Modern economics is evolution. Gradualism in economies is based on the theory of evolution. Political theory is based on evolution. We based everything else on evolution and yet we reject it. And so when we separated from the natural world, we rejected the idea that we are tethered to the rate that we can adapt.
Anderson: I was talking to an economics blogger the other week, a guy named Charles Hugh Smith. We were talking about abundance, and I was asking him you know, do you think that in situations of abundance we have a lowered incentive to use the reasoning part of our brain because the status quo is doing things well enough and we can exist in a little bit more of a fantasy world, and it’s only in situations of scarcity when we really have to put all the fantasy aside, we have to look hard at facts, and make serious decisions because there are people dying or we are starving or something like that. Do you think the crisis we’re dealing with now is just one that, we can see all this stuff but…we’re pretty comfy.
Costa: So let’s buy into that. Let’s say that’s true. Abundance makes us a little more relaxed, a little lazier, a little less ambitious, a little less diligent. We still have our rational mind. If we know that about ourselves, then we can put into place compensatory behaviors.
Let me give you an example about myself. I’ve noticed that when I start to get a lot of money in my checking account, I’m more predisposed to buy stuff. So, since I know that about myself, when I get to a certain level of my checking account, I immediately go down and move that money into a CD where I cannot touch it. That’s my compensatory behavior. So, your rational mind can override these predispositions. The key is—
Anderson: Well, yours can.
Costa: —to know what they are. If abundance is creating a lackadaisical attitude, then get rid of the abundance.
Anderson: I think what I’m curious about there, though, is you’ve got to deal with a lot of people who may never accept science, who may never want to override their predispositions, but are also equally entitled to a vote.
Costa: Well, let’s just start with the big picture and drill down. The big picture is you can go to any country, in the entire world, and they all have the same problems. They’re fighting with manic swings in financial markets. They’re fighting with job creation. Immigration. Terrorism. Climate change. Clean water issues. They all have the same problems, every government.
Now, they all have different political systems. And they all have different economies. So if our problems were economic, or political, they would have different problems. Our problems are not economic or political. So, anybody listening to this, the lightbulb’s gotta go off. If everybody is having the same problem, that’s my definition of a species‐wide problem.
Now let’s drill down to you. What are you not happy with? Are you carrying around twenty extra pounds? Would you like to know why you go for the pizza instead of the salad? Would you like to know why that allowed your ancestors to survive? The environment changes faster than biology. And most people don’t know that. They don’t know why they’re compelled to act in that way but I’m gonna tell ya, if we go back to that, suddenly the guilt, the blame, the shame is gone. Because that big bat that you beat yourself up with because you’re a shopaholic, or you’re fat, or you have a mistress and you keep cheating on your wife, you may not understand the biological impetus for that.
Now that doesn’t excuse it, by the way. As you know, I’m a big believer that you use your rational mind to develop a compensatory behavior. In the same way that when your brain is struggling to accept data, you can use brain fitness.
Anderson: But these are all for pretty simple problems; personal problems of individual free will. But when it comes to say, something really systemic and diffuse, where not only do you need to have a compensatory behavior, but you need to have it in regards to a system with more variables than you can understand, like climate. In a case like that, is it possible to do the same sort of compensatory tricks with your mind?
Costa: We have to always go to the rational evidence. And we have to make those decisions on public policy based on the empirical data that we have available and not allow ourselves to get trapped like the Mayans did and allow public policy to be forged on beliefs.
Anderson: You know, a big part of my conversation with George Lakoff was him saying that look, we’re not really reason‐driven. You know, and so when I think about our conversation, how do we tell people, “You need to have these compensatory systems.” I wonder, how many people are just never going to hear that?
Costa: I don’t think that’s my problem. [laughs] Sorry. I’m not out to beat anybody up and say, “You’ve gotta listen to me.” I think I have some good ideas, some good observations. I hope that they’re helpful, and they’re prescriptive in some way. But I believe certainty is the enemy of knowledge. And to the extent that people think I’m so certain that they should just listen to me, they’re listening to another Jim Jones or another Bernie Madoff—no, you’re a critical thinker. You have to take what I’ve told you and you have to evaluate it. That is an individual decision. I’m not trying to convince anybody, I’m just trying to say, “This is what we have, this is what we’ve done in the past. This is what we could do. We seem to have some options on the table. What do you think?”
We should all be open to being wrong, at all times, and yet passionate about what it is that we think we know at any particular time. It’s possible to be both.
Anderson: I mean, these systems, whether it’s the environment or the economy, you can get a lot of empirical data but we don’t understand them inside and out. I’ve had people in this project, neoprimitivists, talk about you just need to go back, you know. Every time you make technological changes, you keep rolling the dice and rolling the dice, and it seems like what we’re talking about here are more intelligent ways to roll the dice. But by always running forward with this stuff, you just keep running into scenarios where, as you said earlier with your definition of complexity, you just have more and more bad choices. Is it just too much of a gamble? Is it a bad roll?
Costa: You know, you can tell I’m a cheerful person for having written about collapse and people say, “You know, your personality doesn’t match your study.” And I get that a lot. And that’s because I’m around neuroscientists that are making discoveries about how we’re adapting. I’m around people that are developing brain fitness tools…
Not too long ago I had an opportunity to watch IBM’s computer Watson. Watson was a computer that many people maybe saw on Jeopardy. They got the Watson computer to sit between the two biggest Jeopardy champions, and Watson won. But what was really interesting is IBM took that and said, “We want to put this in an ER. Start loading all the medical data,” which is basically doubling every four to five years. There’s probably nowhere where complexity plays a bigger role than when a strange patient who you have no records on is rushed into the ER and you have a matter of seconds to make a decision about which procedures in which orders need to be done.
And they put Watson in I believe it’s Boston, Mass., so that Watson could take normal English from anybody who was on the ER floor, and Watson had all the medical information current, and would spit out what procedures, in order, should be done, and what data would allow Watson to improve its recommendations by 36% or 21%. And if you clicked on “Why, Watson, are you telling me to do this?” it would give you the background. It would say, “I am making this recommendation based on the following.” And it would do this in normal English.
Now, there’s an application of technology that offsets complexity. That allows the data now to be prioritized and used in real time in a matter of picoseconds, and do something that humans can’t do. It’s a compensatory technology. And I believe that there’s room for an entire category of technologies that are compensatory.
Anderson: But when it comes to really big things like the economy or the environment, are you into a different scale of system entirely, where compensatory system— There’s no compensatory system that solves carbon emissions because ultimately what it would demand is that you stop emitting carbon. And you can say well, that’s a simple solution but it demands a multitude of changes for which there is no answer on the ground and we don’t actually know how to structure an economy that wouldn’t do that without creating big financial problems. Can technology solve that?
Costa: We have to start with acknowledging that we’re a biological entity and that there is no surviving without clean water, clean air, food, love. Now let’s move to these complex systems and say “what technology is compensatory and will allow me to bridge the gap between the slowness of evolution, my emotions, my desire to eat glazed donuts? What technologies can I use to bridge that gap that are friendly to the human organism?”
So, we can complain about systemic problems but if we’re cooperating with those systemic problems… Let me give you an example: the current debt. How many people are carrying credit card balances? How did you get those credit card balances? You spent more than you earned. So, now you say, “Well gosh, you know, we’re going into inflation, and I can’t buy a house, and I can’t this and that,” and I’m saying to myself well, it might be a complex, systemic problem but what was your participation in it, and what compensatory behavior did you put into place to keep yourself from participating and fueling that complex system?
Anderson: At the same time, there are systems where you can’t opt out, right? So, to live in the society, just based on its physical footprint, you’ve got to own a car. Your car’s got to be insured. You know, you kinda can’t play by the rules and for a lot of people that drops them right into debt. So there’s this interesting larger social framework where even if you know the compensatory things that you need to do, you’re damned if you do and damned if you don’t.
Costa: I think that it starts with very small decisions that you can come to peace with in yourself. And that when more and more people engage in compensatory behaviors, the system begins to change. Because what causes a system to change is critical mass.
Anderson: If we know that we can take these compensatory measures, why not make the compensatory measure decelerating all the technology? Wouldn’t it be awfully simple to live in a much simpler society, knowing that that suits our biology better?
Costa: We don’t go backwards in complexity. There’s nobody that wants to go grow their own food.
Anderson: [laughing] I’ve actually known a lot of people who do.
Costa: Well, I’m just saying the vast majority of people are not saying, “I can’t wait to become a farmer, and I can’t wait to wash my clothes at the river.” And I became fascinated. As a social scientist I don’t understand what our compulsion for progress is. Why do we want—
Anderson: Right, or how we even define progress, right? [crosstalk] Because is more stuff progress?
Costa: Well, progress to me is more efficient, faster, and more. This is the pandemic addiction.
Anderson: I mean, in that case doesn’t it feel like Tainter is kinda right? We’re just gonna keep like, putting our finger in that beehive until we finally get stung instead of honey.
Costa: Tainter is not wrong… But, Tainter is in some ways fatalistic. Not fatalistic as in “we’re all going to die” but fatalist as in—
Anderson: Fatalistic in that there will be a reduction in complexity.
Costa: —that there is always a— We always revert to simpler systems that we’re designed to manage. So, we’re not going to go backwards, but we do have to use technology in a way that is compensatory for what humans are designed to do and not designed to do.
Anderson: In a way, if things continue growing exponentially more complex and we were able, somehow, to keep following that trajectory, would it just raise the stakes so high that eventually we’d [crosstalk] face something worse?
Costa: What do you suppose will happen if we have a massive financial collapse in the United States?
Anderson: I think we have a stampede mentality. I think people would have seen many many many decades of stability coming to an end, and I think a lot of things we take as rational behavior would leave people, quickly.
Costa: Mm hm. Well, I have a friend of mine who’s an economist and he says anarchy is just four missed meals away.
Anderson: What’s his name?
Costa: John Sumser. And he may be right. You know, maybe we revert to our primitive self. I believe what’ll happen is that we will revert to systems that we all understand. Slowly, farmers markets would open up, and somebody would roll up with a tire and say, “I got a tire, I noticed your truck needed a new tire, and I’ll trade you four sacks of potatoes.”
Anderson: Mm hm. Here’s kind of the elephant in the room when I think about this: population. It’s historically unprecedented, we know that. Can it only be carried by the really complex systems, and is what makes collapse scary the notion that if we go back to systems we can understand, we have to go back to a number of people who can be supported by those systems? Because I’ve had a lot of people go, “Collapse won’t be bad, there’s something better on the other side,” but I always wonder…who dies?
Costa: Well it’s…you’re right. You described it as the elephant in the room, and yes, we are overpopulated. Now, why has every measure to control population failed? It’s not because we don’t have birth control. It’s because the strongest two drives in nature are what? The drive to survive and the drive to propagate. So suddenly now we’re going, “Don’t propagate.” And if you’re going to propagate make sure you go down to the drug store and get a condom and you know— And we’re putting all the steps in and then they don’t work.
We’ve got to get in touch with what we are. You’re right. We’re overpopulating. It’s not a big surprise. It’s not going to end. And when it does end it’s going to end badly. It’ll be a pandemic virus. Wars are taking care of some of this. Climate change will take care of some of it. We’re going to see massive droughts, massive changes in weather. Starvation that nobody can develop enough food for.
Anderson: Is that something that we can use reason to override?
Costa: Well of course you can use reason to override that.
Anderson: You think so? Because it doesn’t seem like it happens.
Costa: We’re not dealing with the severity of the drive. We’re acting like the drive to have sex is like the drive to…eat a hamburger this afternoon. It’s not at the same level.
Anderson: So almost the challenge of our time, then, is to have a real conversation about what we actually are as an organism.
Costa: Absolutely, and I think that when we have these compulsions, we can deal with them. We have to say yes, this is how we are designed. Let’s create outlets for that. And not be in denial about it, you know. A lot of this is just our puritanical upbringings and our religious upbringings, you know. I don’t know.
You know, I have a lot of feelings about where we’re headed, and I think where we’re headed is an environment that is not friendly to the human organism itself. Because we don’t understand what the human organism is and what it needs. And so if you head down that road yes, Joseph Tainter’s right. We’re going to go up and down, and our survival will be elastic. I believe that we have a great asset that has taken millions of years to develop—let’s use it.
Anderson: So you think we can break Tainter’s cycle, in a way.
Costa: Absolutely, because I think neuroscience is teaching us how we think, how we learn. What gives me optimism is that our greatest asset is preemption. Is the ability to look ahead, and then take an action now to either minimize a negative outcome or avert it altogether. And that is what I want us to look at.
Aengus Anderson: So there we go. Preemption. That is the idea that this conversation comes back to again and again and again. The idea that the rational part of your brain can sort of govern and manage the emotional part of your brain. There are a lot of different ways that people have dealt with reason in this project. A lot of people talk about the intelligence and the necessity of emotions; that's actually going to play into our next conversation, with Kim Stanley Robinson. But this one's very much about how reason needs to govern the unruly emotions.
Neil Prendergast: So many of our interviews have taken up this idea and—in different ways of course, but I think one thing that we've always looked at is, well then how would that mechanism of this sort of rationalist thought filter into society so that it's actually something that's working at that…well, really just societal level.
Anderson: The mechanism… I think that's the word you used, mechanism, that was something that I tried to go after a lot in here. Now, I'm still a little bit unclear on like, what is the mechanism. You know, something that really intrigued me was the way that this sort of telescoped up and down between the macro level, like we need to have compensatory measures on a bigger level. Like, here's how you avoid climate change through compensatory measures. But also how it really went down to kind of the personal, self-help level. Here's a compensatory measure that keeps you from the doughnut.
Prendergast: Right. Right.
Anderson: It was kind of jarring in a way because I was like, wait. Here we're talking about climate change and here we're talking about…doughnut lust. But like, for her, they're both pieces of the same psychological problem.
Prendergast: Right. You think about doughnut lust, I think the thing that stood out for me was sex and hamburgers. [both laugh]
Anderson: You know…
Prendergast: I wasn't sure which, actually. But you know, I think it's interesting because it does map onto some of the earlier conversations, ones in this newer set that we're doing right now. In particular George Lakoff.
Anderson: Mm hm.
Prendergast: And you know, I'm thinking here of the terminology used there between systemic causation and sort of individual causation. And it seemed like you guys are swimming in the same waters in this conversation.
Anderson: In the same waters…? Perhaps. Because we're talking about the same…themes. But boy there are some really stark differences. Just to go back to the idea of reason and emotion. I mean, Lakoff would say like, "There's a mind. And the reason/emotion divide is silly. They're so interwoven."
I felt that something I got out of the conversation with Lakoff—and I'm not sure if this is exactly what he would have wanted to impart. But something I got out of it was there are different types of reason. Totally different types of reason. And so, what seems totally irrational to one person could be absolutely rational to another.
Anderson: And I feel like, when Rebecca's talking about reason, there is sort of like, some ontological reason that's out there that everyone has a part of.
Prendergast: I think that they also have different implications, too. So for example, with Lakoff it really seemed like in his mind we should understand systemic causation better, because we should try to change the system. And I know from his other work that's what he's trying to do.
But it really seemed that Costa was going in a different direction. I kind of want to go back to an early part of the interview that you did, where she was talking I think about diet? And she said well, once you know that there's these systemic reasons for maybe why it's difficult to lose weight if you're trying to do that, then you said yourself, "Oh, well, I don't feel so morally bad for not being able to do it."
And I think then Lakoff would probably say, "Well then okay, let's figure out why society has shaped our environment so that it's difficult to do this, and let's play the policy game of figuring out how we can change the environment so that it's easier to lose weight, perhaps."
And she seems to just sort of stay over here in the, "Well, now I feel better about it, and maybe I'll make a different individual decision." But I didn't see that loop back to the political.
Anderson: Yeah, and that's a huge difference, right. I mean, I felt like talking to Lakoff, everything is about the political. With Costa…it's kind of the individual inheriting all of this biology, and then grappling with that, and then in some small way maybe that makes larger change. But the bigger systemic change isn't really her interest. And she says that at one point, where she says you know, "I'm not out trying to like, beat people over the head and change their minds. What I really want is I just want to say what I know, and then let people sort of determine for themselves." So again, very much an individualistic notion.
Prendergast: I think at the very least we could say you know, she certainly is an indication that people today are very very interested in thinking about human biology as they try to at least address the problems of the day.
Anderson: We could talk about this forever, as with any of these things. But something that I was sort of left thinking about was, where are the political teeth in this? How does it really create meaningful, directed change? Like, what are the prescriptive parts of it beyond the kind of self-help aspects? For me this really crystallized when we were talking about complexity as a social tool.
And Rebecca says you know, GE can hire a building full of tax lawyers, and so they can navigate complexity, right. Complexity's a system that benefits them. And I wanted to know, well in that case will complexity never go away because there are a lot of rich people who benefit from it? It's like a filter that keeps them in an upper echelon of society. And it felt like she didn't want to go there. She's like, "Well, I'm not thinking conspiratorially." But of course, I don't think that has to be a conspiratorial thing. It's just good business. But I felt like that was something that she didn't want to…touch. If you'd thrown an idea like that at Lakoff, he could've received it in a whole variety of ways. But you know he woulda gone straight into the politics of it and been like, "Well yeah, here's some entrenched power interests. Here's how this helps them. Here's how we take that power back."
That was Rebecca Costa, recorded on June 14th, 2013 in Santa Cruz, California.
You can find this interview at the Conversation website, with project notes, comments, and taxonomic organization specific to The Conversation.