I’d like to start with two of my favorite quotes. Arthur C. Clarke said, “Any sufficiently advanced technology is indistinguishable from magic.” And William Gibson said, “The future is already here—it’s just not very evenly distributed.” And when I synthesize those two quotes, my takeaway is that there are magicians amongst us.

So, what does magic look like today? It looks like me being able to put an electrode on my forehead, controlled by an iPhone in my right hand. And three minutes into the program, with that iPhone driving electrical impulses into my skull to relax me, I had so much trouble putting words together that I felt like I was a six-pack deep into drinking.

But this is about the future of war. I want you to imagine the ability to take a few drops of someone’s blood, to combine that with some cognitive and some physical testing, and to be able to figure out where that person is going to be most effective in your military. And by most effective I don’t just mean how good they are at performing. I mean they’re going to be happier there, too. Because if they’re not happy and they leave, you lose a lot of effectiveness. And then once you’ve put that person in a place where their skill set is optimal, you’re going to be able to enhance them beyond what they come in with.

So why do I believe this? Here’s data showing the correlation between performance and the ratio of two physiological variables: DHEA sulfate, which is a steroid hormone precursor, and cortisol, which is a stress hormone. The ratio between those two physiological factors alone allows you to pick up 37% of the variability in performance during a special operations task.

And that same technology I was using in the first picture is here on an airman. What you see from the graph on the left side is that people using transcranial direct current stimulation (AKA electricity at a very low level going through your skull and changing the activation pattern of the neurons in your brain) were able to focus on a fairly challenging task of watching and tracking aircraft on a screen without losing any performance for forty minutes, while people given the placebo lost 5% of their performance every ten minutes. What would that mean for someone driving down a street trying to scan for IEDs?
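As a back-of-the-envelope sketch, here is what that decrement looks like over the forty-minute task, assuming the placebo group’s 5%-per-ten-minutes loss is a flat linear decline from baseline (the talk doesn’t specify the exact curve):

```python
# Sketch of the vigilance decrement described above, assuming a flat
# 5%-of-baseline loss every ten minutes for the placebo group.

def placebo_performance(minutes, loss_per_10min=0.05):
    """Fraction of baseline performance remaining after `minutes` on task."""
    return 1.0 - loss_per_10min * (minutes / 10)

def tdcs_performance(minutes):
    """The tDCS group reportedly held baseline for the full forty minutes."""
    return 1.0

for t in (0, 10, 20, 30, 40):
    print(f"{t:2d} min  placebo={placebo_performance(t):.2f}  "
          f"tDCS={tdcs_performance(t):.2f}")
```

Under this simple model, the placebo group finishes the forty-minute task at 80% of baseline while the stimulated group holds at 100%.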

But today we only think about someone sleeping a little less, somebody running a little longer without fatigue. I want to take this a lot further. I’m thinking about the ability to use pharmaceuticals and electroceuticals (so electricity, like I was talking about before) to enhance learning to such a degree, and to enable unit cohesion to come together to such a degree, that you could train a military force to high levels of capability within weeks or months instead of years or decades. What would that do for strategic surprise?

I’m thinking a lot about research in autism and other areas. And autism, by the way, is a disorder where people have tremendous difficulty understanding why others think and act the way they do. Now imagine if we were able to use that same type of research, or one of our adversaries did, to enhance the ability to think like your adversary. What would that do for psychological operations, for deception, for influence?

And I’m thinking a lot today about how you could improve the general purpose force to match SOF levels of performance. If you just used that better general purpose force in the same way, I don’t think it would get you that far. But if you add new CONOPS and organizational structures, break it into smaller units, use swarming tactics, I think you might end up with something like an order of magnitude improvement in performance.

And General [Mark] Milley this morning mentioned that it’s not just the size of a force; a smaller force could smoke a bigger force if they’re better-equipped or better-trained. Well, part of the corollary to that may be a question of what they’re smoking.

And I’m thinking about direct brain-to-brain communication. There are current DARPA programs using brain/machine interface technology to try to replace the memory of our injured warfighters, which is a tremendous goal and undertaking.

But if you can replace someone’s memory, that means you can store ideas and transmit them into their brain. That means I can take those ideas and transmit them into someone else’s brain. Imagine taking a plan and being able to give it whole hog to someone else to think through, to iterate on, and to send back to me. What would that do to the rate of innovation?

But I think anybody should ask: why should we believe any of this is coming, if for the last twenty years people have been saying human enhancement technologies are going to really matter? I think there are three trends that mean we need to take a serious look at what’s going to happen in the next two to three decades.

So on the bottom there, neuroscience. Our scientific knowledge in the past ten years alone has given us a whole new field, epigenetics: chemical modifications on your DNA that don’t change the letters in the code can not only change the way you think and act but can also affect your children. We also have new tools. Staying on the neuroscience theme, we’ve developed tools that allow brain cells in animals to be activated or deactivated using light, a technique called optogenetics. And we have new scanning and imaging technologies.

And then move up to the top there. These are two sort-of-fake street signs at a startup in Boston with tens of millions of dollars in funding that’s developing human enhancement technologies for commercial applications, not for medicine. So money’s flowing in, too.

So, I want to take you to an analogy with Moore’s Law, which we’ve heard mentioned a couple times today. In the first two decades of real microprocessor and solid-state transistor development, we saw about a 1000x improvement in the number of transistors you could get on a chip. But frankly, the impact on society from that advance was relatively small. In the second twenty years, you only had about another 1000x improvement from where you started in the middle there, but of course it changed the world. Microprocessors drove the Internet, and in this context C4ISR, the precision-strike complex, and so on.

So I think the takeaway is that not only is technology development nonlinear, but it can be further nonlinear when you’re looking at impact. What might this mean when we think about human enhancement technologies? Well, one of the other things that happened as those transistor counts increased is that the cost of each of those chips decreased. So here’s the rate of decrease in the cost to sequence a human genome, all three billion or so base pairs. And you see it’s dropping faster than Moore’s Law.
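The Moore’s Law arithmetic here can be checked directly. The sequencing-cost figures below are rough, illustrative numbers (on the order of $100M per genome in 2001, around $1,000 by the mid-2010s), not data from the talk:

```python
import math

# 1000x transistor growth over twenty years is roughly ten doublings,
# i.e. the classic ~two-year Moore's Law cadence.
doublings = math.log2(1000)            # ~9.97 doublings
chip_doubling_years = 20 / doublings   # ~2.0 years per doubling

# Sequencing cost: ~$100M per genome (2001) down to ~$1,000 (mid-2010s).
# Rough illustrative figures only.
cost_halvings = math.log2(100_000_000 / 1_000)    # ~16.6 halvings
sequencing_halving_years = 14 / cost_halvings     # ~0.84 years per halving

print(round(chip_doubling_years, 2), round(sequencing_halving_years, 2))
```

Even with these rough inputs, the cost of sequencing halves several times faster than transistor counts double, which is the “dropping faster than Moore’s Law” point.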

But now I’m going to put that on the same forty-year x axis I showed you before, and we’ll start at 2003, the end of the Human Genome Project and the first sequenced human genome. And if you take that date as a start, we’re not even thirteen years into that first twenty years.

But why should the timeline be similar to computers? Well, I’d suggest that biology is a lot more complex than computing…but now we have computers to help. I think the timescale is relatively similar. So I’m willing to make a fairly falsifiable projection and put my name on it. I think by 2043 (and this is my conservative forecast), human enhancement technologies will have about the same level of impact as computers did five years ago. And frankly I think it’s coming in the 2030s.

Of course, beyond why we should believe it, there’s a second question: will we matter? Well, in this case I want to take you back to the last time humans were obsolete. The F-4 fighter was built without an internal gun because we were in the missile age. And in the missile age, humans couldn’t dogfight well enough for it to matter. Pilots were just going to use sensors to see the other aircraft, fire a missile, and be done.

Well, we went into the Vietnam War, and it’s an amazing experiment: the Air Force and the Navy both flying the F-4. In the first two to three years of the air war over North Vietnam, both had about a 2:1 exchange ratio. That is, they were shooting down two North Vietnamese aircraft for every one we lost.

And this was a real surprise. We expected to do a lot better. So the Air Force goes back to the drawing board and says the missiles aren’t tracking properly, and they work on them. And the Navy goes back to the drawing board and says the missiles aren’t tracking properly. But about halfway through that study, a gentleman named Captain Ault says, “Actually, a huge part of the problem is that our pilots don’t know how to use them effectively.” Because we’d stopped realistic air combat training; missiles were going to do it all for us, and humans didn’t matter.

So, the Air Force goes back to the air war in 1970, after about a year’s break while we’re in negotiations with North Vietnam, and they’re only able to maintain their 2:1 exchange ratio. But with the implementation of TOPGUN in ’69 and ’70, with its realistic air combat training and its practice of sending two people per unit forward to then train their units, the Navy goes to 12.5:1, almost an order of magnitude improvement in tactical outcomes, because of human performance. And to anyone who doesn’t think training is a true human performance technology, I suggest looking at the research on neuroplasticity showing that training changes the structure of your brain.
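A quick check on that “almost an order of magnitude” claim (the exchange ratios are from the talk; the arithmetic is mine):

```python
import math

# The Navy's post-TOPGUN jump: from a 2:1 to a 12.5:1 exchange ratio.
before, after = 2.0, 12.5
improvement = after / before                   # 6.25x better outcomes
orders_of_magnitude = math.log10(improvement)  # ~0.8 of a full 10x

print(improvement, round(orders_of_magnitude, 2))
```

A 6.25x jump is about four-fifths of a full order of magnitude, so the claim holds up.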

This field is often discussed with a lot of concern in venues like this because of the ethical issues, and this fear that, God forbid, somebody would force our military personnel to use these technologies, and what happens if someone gets hurt? Well, let me tell you something. They want these technologies. My clients in Silicon Valley and in Boston pay good money for enhancement because they see it as a competitive advantage. And when I give these talks in environments with a lot of junior military personnel, the typical response is that somebody comes up to me at the end and says, “If you ever need a research subject, I promise I won’t tell anyone.” Now, that’s not how we do it. The military actually has the strictest ethical and bureaucratic restrictions on doing this type of research anywhere in the United States. It is harder to do this research in the military than anywhere else.

This is data on Army personnel taking supplements. And you can see that more than 50% of Army personnel in this data were taking at least one supplement per week. And what’s really interesting here is you’d expect that to drop off, with fewer and fewer people as the number of supplements increases. But actually more people take five supplements a week than take two to four. Which tells you you have a superuser group. And anyone who spends time around military personnel knows who these people are.

And on the ethical issue, I would suggest we consider reframing it. We require our military personnel to go on the battlefield with thirty, maybe even forty pounds of body armor. And we don’t just think or worry; we know it’s damaging their knees and their backs severely, and these are lifelong injuries. At the same time, it’s decreasing their mobility and therefore parts of their operational performance. But we have human enhancement technologies, ranging from drugs to stimulation, that are safer and enhance operational capability. So to anyone who says we shouldn’t give those to people, I would say that’s a PR issue, not an ethics issue. Thank you very much.

Further Reference

The Future of War Conference home page.

Mind+Matter, Andrew Herr's web site.
