Luke Robert Mason: You’re in for a real treat this evening. I am blessed to be able to welcome Jaron Lanier to Virtual Futures. My name is Luke Robert Mason, and for those of you here for the first time, which is pretty much Jaron and nobody else, the Virtual Futures conference occurred at the University of Warwick in the mid-90s, and to quote its cofounder it arose “at a tipping point in the technologization of first-world cultures.”

Now, whilst it was most often portrayed as a techno-positivist festival of accelerationism towards a posthuman future, “the Glastonbury of cyberculture” as The Guardian put it, its actual aim, hidden behind the brushed steel, the silicon, the jargon, the designer drugs, the charismatic prophets and the techno parties, was much more sober and much more urgent. What Virtual Futures did was try to cast a critical eye over the phenomenal changes in how humans and nonhumans engage with emerging scientific theory and technological development. This salon series completes the conference’s aim to bury the 20th century and begin work on the 21st. So, let’s begin.


Luke Robert Mason: For this crowd, Jaron Lanier needs no introduction. Our distinguished guest is credited with inventing the iPhone. Except it wasn’t the iPhone, the small smart computing device made by Apple, it was the EyePhone, or E‑Y‑E-phone, quite literally a phone for your eyes.

His new book Dawn of the New Everything tells the story of this device and the VR startup that created it, VPL Research Inc., a company that Jaron founded in 1984. Anyone notice the troubling irony? Nineteen…eighty…four.

Often credited as having coined the term “virtual reality,” it is Jaron that we have to thank, or perhaps chastise, for the virtual insanity that’s been plaguing all culture of late. As the VR industry promises enhanced worlds in which we’re all gaming with each other, Jaron’s book reveals VR’s dangerous potential in allowing us to game each other. Or, be gamed by more problematic external power structures.

But this is not a dry book of dystopian technological predictions, it is hopeful and it is hyperlinked. And when you read it you’ll know what I mean by that. This book is a manifesto for worldbuilding. And what’s so clear is that Jaron truly and deeply understands what it really takes to develop immersive and, more importantly, satisfying experiences in VR. And that’s not because he’s an accomplished computer scientist or technologist, although those things are true, but it’s because he’s deeply connected to what it means to be human.

So to help us understand the new everything, please put your hands together, stamp your feet, go wild, and join me in welcoming Jaron Lanier to Virtual Futures.

Jaron Lanier: Hi! 

Mason: So Jaron, I want to start at the beginning of the end. I want to start in the 80s, when you moved to Silicon Valley. Because when it comes to the Internet, we got what we wanted but it wasn’t necessarily what we thought it was going to be. How did you envision cyberspace?

Lanier: Oh gosh. Well, I mean… That’s quite a hell of a question, because it’s a, it’s a big thing. Cyberspace is a particular word that was made up by Bill Gibson, who writes books. And around the time we actually— [comment from audience about using his microphone] Oh, god. Alright here, how about that? Is that good? Can you hear?

Yeah, so around the time we incorporated VPL, Bill published a book called Neuromancer that I’m sure people remember. And so the rule used to be that everybody has to come up with their own names, so there were a zillion names in currency for what we ended up calling virtual reality, which was my little version of it.

And Bill… Oh God, you know. Something I feel like I can’t fully tell this story without his permission, because some of it’s very personal.

Mason: We’re among friends.

Lanier: Yeah. But I’ll leave out some of it. But he took not only— He was writing in, I think, an extremely important and wonderful dystopian tradition about this very thing. Much of it English, which I— How many people have read The Machine Stops by E.M. Forster? Oh, please…

Mason: That’s disappointing for [inaudible].

Lanier: Oh come on, people. Come on. Get with it.

Alright. So this was written something like 110 years ago. And it’s a science fiction novella. And this is the fellow, you know, who wrote A Room with a View— Can you guys still hear me, by the way? Is this close enough? Yeah? No?

Audience Member: Not very well, no.

Lanier: Here, I’ll just kiss this stupid piece of foam for you. Is that dystopian enough for you? Is this what it was all for?

Okay, so. So anyway. So I forget if it was…it’s something like 1907, 1908, 1909, something like that. E.M. Forster writes this novella, and he describes a population addicted to these screens connected to giant global computers. And the screens are hexagonal, suggesting that the people have become like a beehive, like a hive mind. And they talk about topics and they do little video calls. And they all get kind of lost and everything becomes a little unreal, and you get the feeling that they’re all subject to this power structure. And then there’s a crash. It’s not a very reliable machine. There’s a crash and all these people die, and they finally crawl out of their cubicles and they see the sun, “Oh my god, the sun. Reality.”

Anyway. This was 110 years ago. And in a way he nailed it. I mean, all of the dystopian science fiction literature since then, whether it’s the feelies, or the Matrix, or so many others, are in a sense echoes of The Machine Stops, so you ought to read this thing. I mean, if you’re interested in this world.

And in a sense I thought Neuromancer was in the tradition. It was another… The Machine Stops is such a profound work it actually created its own literary genre that continues to this day, and I thought Neuromancer was something like a cross between William Gibson and E.M. Forster or something like that. But it was also much more than that, but that’s a story for another time.

Anyway. I used to argue with Bill about Neuromancer. Because he would send me scenes he was writing. He’d been writing short stories in the same style beforehand. And at that time, my thing was virtual reality could either end up as the most beautiful medium that brings people together and helps bridge the ever-mysterious interpersonal gap. It could be the greatest form for art ever. It could be this thing that makes imagination more valued. It could make it possible to extend some of the more luminous aspects of childhood into the rest of one’s life without losing them. There were all these hopes I had for it. I hoped it could be a beacon of empathy that would help people behave better.

But it could also turn out to be the creepiest thing of all time, because it could be the ultimate Skinner box. Do you know what I mean by Skinner box? Everybody know that reference? Okay. Because I never know these days what anybody knows.

So it could be like this mind control horrible thing that could be really awful. It could be the creepiest invention ever. And I thought it would flip one way or the other, and if we kind of set the right tone at the start, that might help set it on the positive course. So I used to call up Bill and bug him. And I’d say, “Bill, your stuff’s so creepy! You have to make it nicer!”

And I’m not really good at doing voices, but at that time he still had this incredibly strong Tennessee Southern accent. It was like, “Well Jaron, you know—” I can’t do it, but anyway he would say, you know, “I’m a writer! I write what I write. I can’t change it for this program of yours, even though it’s very nice.”

But when we started to get VR stuff working around, oh, I don’t know, ’84-ish or something like that, he would come by. Which was hard at that time. I don’t know if— He was stuck in Canada, because he’d gone there to avoid being drafted into Vietnam. So it was hard— His situation was difficult at that time. But anyway, we’d get together and he would say, “You know, if I had it all to do over I would just work on this tech instead of being a writer.”

And I’d say, “Be a writer.” But then I said, “Okay, but hey. If you want to come work, you know…”

And he was like, “Uh…maybe I’ll be a writer,” after he saw the realities of Silicon Valley, where you work all night and you just completely work yourself to the bone, on and on and on.

Anyway. So at that time, so long ago, the way I thought about it is it could flip either way, and what we had to do is just kind of say the right incantations, just get it set on a good course. And where I am today on it— Is this answering your question at all, or have I totally—

Mason: I’m enjoying the stories, so I’m going to let you continue. But Jaron, the question I was going to ask you was with regards to the Internet itself. So cyberspace is what Gibson envisioned, but the Internet, do you believe, went off course?

Lanier: Well, yeah. I mean, kinda obviously. I mean… Let us count the ways. I mean, under the current Internet as it exists right now, before we even talk about our own situation in the United States, I’ll mention that what’s called in the trade “shitposting” seems to’ve played a critical role in fomenting the Rohingya crisis, in destabilizing some parts of India, some other issues in parts of Africa, particularly Sudan. And so there are people dying who do not have the benefit of an advanced society or stable government as a backup. So out in the wild it’s absolutely deadly.

I could also talk about the absolutely extreme and untenable wealth and power concentration that’s been engendered by the digital networks, which mirrors the late 19th-century Gilded Age’s but is probably worse, and is absolutely unsustainable and headed off a cliff.

I could also mention the destabilization of democracies all over the world. Before social networking as we know it showed up, we had thought that once countries went democratic they would tend to stay that way and perfect it. And instead we see the United States, Turkey, Egypt, Austria…I mean, who knows.

I hate to even say these names, because I want to be wrong. Like, I mean, it’s horrifying to even think of this. Just speaking of the United States, it’s a thing we never expected and it’s a terror, and it’s…I don’t know where it’ll end. I don’t know how bad it’ll get. And it’s very clearly related to what’s happened with digital networks. So I would describe it as an abject failure. It’s…the program of my generation of computer scientists has failed. So…

Mason: Well, John Perry Barlow said in A Declaration of the Independence of Cyberspace that this should not be a construction project. And yet it has become a public construction project. Is it time, Jaron, for a demolition?

Lanier: Mm. Well, my take on it—oh god. This is a tough one. Barlow and I are on opposite sides of interpreting this, and I think it’s kind of broken his heart, and I don’t know what to do about that. We were kinda back at the time, too.

Okay, so here’s what I think happened. What I think happened… And this is not to lay blame on anybody, because I don’t think anybody really knew for sure how this stuff would work. This was all rather mysterious and experimental. But what I think happened was, back in the 90s and into the turn of the century, there was an extremely strong sensibility that things should be free. That information should be able to move without restrictions related to payment. There should be free music. Email should be free.

For instance, in the 90s there was this controversy about whether some kind of tiny, homeopathic amount of postage might be good on mail. And the reason for it was very simply— I mean, there was an immediate reason and there was a long-term reason. The immediate reason was that even the slightest amount of postage would shut down spam, which was already a gigantic problem, right. So if you spam like millions of people, even a tiny amount of postage costs some money, right.

But then the longer-term thing is that if there was some sort of commerce built in from the start, then eventually, when the robots start doing more and people take on new roles, at least there’s a chance they could get paid instead of becoming wards of the state. So it holds out some kind of hope for that option, which wouldn’t be there if some kind of payment wasn’t generalized, right.

So those were the two reasons. But this idea that things should be free totally swamped that. People were just very, very, like, sort of militantly dogmatic about it, and many still are. But here’s the problem. We then created this kind of super-open Internet where everything can be free, nothing’s traced, you can make copies of anything. And I can get back to this whole idea: being able to make open copies of things just means that nothing has context anymore. You don’t know where anything came from. The original idea for social networking was to not make copies, because copies are an inefficiency anyway, and you just trace back to the original, both for payment and for context, so you would know what things are.

[To audience:] So when was the first design for a digital network? What year?

Audience Member: 1969?

Lanier: 1960. And it was by Ted Nelson at Harvard. Yeah. He didn’t call it Xanadu yet, but he designed one in ’60 and it was the first one. The packet-switching idea predated it, but there wasn’t an architecture. And in that one, you didn’t copy, because copying seemed like an unconscionable expense. Instead you traced back to origins. But his reasoning for that was extremely sound, which is then everything has context and somebody can’t misrepresent somebody else’s context and appropriate it. Plus, if you want to have capitalism, people can be paid. And if you don’t want capitalism you don’t have to, but it’s an option. So it increases your options. So that was his argument in 1960, and so that was absolutely rejected by this feeling that things must be free.

And this idea that things must be free had merit. I mean, I understand the arguments. There were a series of things that had happened that… Well, I mentioned Bill was avoiding the draft. There were a lot of people who felt that the ability to hide was the most important thing ever in freedom. And so they ignored other things that might also be important. That’s the short answer to how it happened.

But anyway. So we created this network design where everything’s free, everything’s copyable, context is lost. But then we thrust it forward into— And the “we” is specific people, many of whom are very straight. Al Gore essentially did invent it as a political project. Just to be clear, that was actually what happened. You might not even know that controversy, but— [inaudible audience comment] The pipes thing was somebody else’s comment. But anyway. That was a Bush comment.

We thrust this open network into a capitalist context, where the larger society was still one in which you had to pay rent and where you were expected to start corporations, and if somebody put money into something they wanted a profit, and all of this. And we made a very raw Internet that didn’t do much. It didn’t have identity mechanisms, it didn’t have payment mechanisms, it didn’t have persistence mechanisms, it didn’t have…really much of anything. It was just this very raw thing.

So the question is, what do you do? So eventually somebody invented the World Wide Web on top of it, and… A British thing. How about that, yet another local thing. Tim Berners-Lee. Well, he did it in Switzerland, so. I don’t know.

Mason: Well, we have current issues with Europe, but.

Lanier: No no, I would like to finish. Sorry, no, you don’t get another question yet. I need to finish this.

So here’s what happened. If you tell people you’re going to have this super-open, absolutely non-commercial, money-free thing, but it has to survive in this environment that’s based on money, where it has to make money, how does anybody square that circle? How does anybody do anything? And so companies like Google that came along were, in my view, backed into a corner. There was exactly one business plan available to them, which was advertising. And advertising doesn’t sound so bad until you remember what all of that cautionary literature pointed out for so many years (I should also mention Norbert Wiener, who’s an important figure in cautionary literature), which is that in a cybernetic structure, if you have a computer measuring people and then providing feedback based on that measurement, you are no longer offering them persuasive communication. It’s like a rat in a Skinner box. You’re modifying their behavior and you’re addicting them. You’re doing both of those things. And you’re doing it inevitably, irrevocably. And so essentially what we said is the only thing you’re allowed to do on the Internet is build a behavior modification empire; everything else is disallowed. So it was, in a way, a project of the left that created an authoritarian Internet. And I think it backfired horribly and I think it was disastrous. And that’s the thing that needs to be undone.

Mason: So to some degree, Jaron, it was logical that this would be the outcome. Why do you think we’re seeing now the individuals who were there at the beginning starting to come out against some of these, as you call them, behavior modification empires? Both Ev Williams, and only a week ago Sean Parker, have tried to step up to be the Cassandra of Silicon Valley and go, “Well, we knew this was going to happen.”

Lanier: You know… I need to ask Sean about that. The day before he said this thing about how, “Oh yeah, we intentionally set up this addiction loop,” there was a piece, an interview of me by Maureen Dowd in The New York Times, about that mechanism, and it’s possible that it was tied.

But this has been an open secret. Everybody’s kind of known that this was happening. And I think people have to come out against it because the world’s being destroyed. It’s a matter of survival. I mean, it’s really becoming so dangerous.

Mason: So then the question becomes how? How do we divorce these platforms from the nonhuman agency of capitalism, which has morphed them into these problematic entities, these behavioral modification empires, as you call them?

Lanier: Yeah… Well, see, this is the trick, isn’t it? I’ll make a few observations. One observation is that we live in a capitalist society; if somebody has a program that they think is better than capitalism, I’m not ideologically opposed to such a thing, it’s just not gonna happen in three seconds. So, just for the moment, in the short term, we have to survive in a capitalist society. And so we have to think about how there can be businesses that don’t rely on behavior modification.

And fortunately there are a lot of them. I mean, for instance, um… Oh god, I don’t know where to begin. Of the giant tech companies there’s only two that do it. Google and Facebook do it, Facebook more so. Amazon, Apple, Microsoft and many others actually sell goods and services. And you might feel that they should be criticized for various things, and you might very well be right that they should be. But not for that, alright. I mean, they might have little experiments in that direction, but they don’t really…that’s not their main thing.

So it’s really two big companies, and mostly one big company, and then a few smaller companies that are failing as businesses. I mean, Twitter as a business is this really wobbly thing. So it should want to try different business ideas. And it’s not hard to imagine what these might look like. So if there were like a hundred thousand companies doing it, that would be harder, but it really kind of boils down to just a handful that need to be changed.

Could that be done by regulators? Maybe. I think what makes more sense is to try to just kinda cajole the companies, just say, “What you’re doing is really stupid. You’re decent people of good will. Just do something different.”

And the way they transform—like, you could imagine a transition— There are a bunch of different paths, but the one that I think is the easiest and makes the most sense is Facebook says, “Hey, you know, if you have popular posts we’re gonna start paying you. And then gradually we’re going to also start charging you, but a very low amount, and if you’re poor, nothing. But if you’re not poor, at least a little tiny bit.” So this is getting back to, like, digital postage. And we’re going to gradually start refusing people who want to pay to manipulate you. And then we have a target that in five years, or whatever it is, this thing will become a monetized social network instead of a manipulation-based one. And there’ll be no hidden third parties who’re paying to affect what you experience. And boom, done. And then Putin has to go cry in a corner.

Mason: So do you think where we are now is just a passing fad? Do you think this will come to pass and we’ll look back on these twenty-five years where we ended up with social media and go, “Ha, weren’t they so silly to build it that way?”

Lanier: Well, that’s certainly what I want. I want this period to be remembered like a bizarre bad dream that we passed through. I want this to be remembered like other stupid things that’ve happened, as just this historical period that was just incredibly strange, that we try to teach kids about and wonder if we’re really doing it well enough. I want it to be like that.

But I don’t know, though. I mean, this could get a lot worse before it gets better. I just— I don’t know where it’s going. It’s not clear how bad things are going to get in the US. But it’s bad. It’s really bad. It’s scary. I mean, I live in Berkeley, California, and periodically right-wing demonstrators come to try to provoke fights. And something’s happened which has never happened before, which is once in a while, on these days when they’re coming and hoping for a fight, there’ll be these guys, like, in pick-ups driving round, and they’ll pretend to swerve at you if you look leftist. And then they’ll cut back and just go off. But it’s a weird, scary thing. And people have started just staying in their homes. It’s like a thing that has never happened before. So it’s bad. It’s really bad. And all these people live in this other reality, which was created by— If they’re old enough it was created on cable news, but for this population it’s all social media.

Mason: Well, let’s start talking about other realities. Because the situation we’re in with fake news, and the way in which people’s perception is manipulated by these empires, is a form of virtual reality, is it not?

Lanier: No…

Mason: No?

Lanier: No, I mean I— Well, I mean, listen, I don’t own the term or anything, so you have as much right to define it as anyone. But it certainly isn’t… It doesn’t correspond to any of the ways I use it. If you want to use the term to refer to an instrumentation strategy or something like that, that’s okay. If you want to use it for marketing your product or something, I guess, whatever.

But if you want to talk about it in broader philosophical terms, what I hope we’re talking about is a medium of personal expression that might be used by mysterious third parties but so far hasn’t been, because it’s so nascent. I mean, what I hope is that this period of the darkness of social media will help us sort this out before virtual reality becomes more commonplace. Because it’s going to be much more potent than this really crude stuff like, you know, Facebook on a phone, which is really not much of anything compared to what’ll come, you know.

Hey, by the way, you’re all smart, hip people, right? Will you delete your accounts?

Audience Member: Facebook?

Lanier: Yeah. Get rid of it, it’s stupid. You don’t need it. You think you need it, but it’s an artificial addiction. Just get rid of the stupid thing. Like, come on.

Mason: Get rid of Facebook, but if you are following this conversation on Twitter—

Lanier: Oh, Twitter too. Get rid of Twitter. Twitter’s really stupid. Just stop it.

No, actually— Let me say a couple things about that. What happens on Twitter and Facebook might be quite beautiful and amazing. I’ll give you an example. In the US there was a movement that started on social media called Black Lives Matter, that brought awareness to a phenomenon— This is another thing, it was kind of an open secret, and anybody who knew anything knew it was going on. But somehow it just shifted into something that was an open open secret that people actually talked about. It made it more real. And this was this horrible phenomenon of unarmed black kids suddenly getting killed after a traffic stop, over and over and over again, the police not being prosecuted. And it was like this national blood sport or something, it was this horrible thing.

So there was this movement to raise awareness about it and to try to reform police departments. So, Black Twitter is a form of literature. It’s a beautiful thing. It’s a legitimately extraordinary literary phenomenon. And it was fun when, like, Trump engages with it and they totally run rings around him, Black Twitter users. So it’s cool.

But here’s the thing, though. At the same time that people are using Twitter for something like Black Lives Matter, or currently for #MeToo, there’s this other thing going on, which is that the algorithms, without any evil genius directing them, are putting the people who were doing this into bins. And then what they’re doing, automatically, without any evil genius directing them, is testing what it’s like when other people react to people from those bins who are in, say, Black Lives Matter or #MeToo or whatever. And then the algorithms are naturally tuned to further what’s called “engagement,” which we might more properly call addiction.

And so if there’s something from Black Lives Matter or #MeToo that then upsets some other group, the algorithms will automatically optimize that to upset them as much as possible, and vice versa. So now the people are automatically, without any evil genius directing them, forming themselves into groups. And then what happens is advertisers or saboteurs or weird billionaires with a stupid agenda or, you know, Russian information warriors—whatever it is. They come along and they use the Facebook tool and they say, “Oh, what can I buy? What can I buy?”

And then this thing is like super optimized. It’s a super bargain. So like, “Yeah!” So they do that, then they push it even more, and it goes more and more. So the interesting thing is that Black Lives Matter, through its success, undermined itself because of this structure. It’s a dirty trick. It’s a dirty, dirty trick. At the very moment that you’re succeeding, you’re building in your failure. And you will suffer in the polls. You will create a societal reaction against you, particularly if you start out in a minority position or in a relatively unempowered position. It’s a fundamentally impossible game.

Now, I hope that that’ll be proven somewhat wrong in the American elections. That it’s just so outrageous that we’ll be able to gather enough steam, but I’m not sure. This mechanism intrinsically undoes and inverts social progress. It’s a social progress inverter, and it’s built in, and there’s no way to fix it without changing the business structure.

Mason: Well, this time around, Jaron, reality is at stake, because it may go into these virtual reality realms. And we look at things like Facebook social VR, and I have to ask, what did you think when you saw Oculus purchased by Facebook in 2014? Did you go, “Great! They’re buying these companies. My dream will come true.” Or did you go, “Oh god, no. Hell no.”

Lanier: Yeah. I had really mixed feelings. I mean, I was kinda happy in a way, because… I mean, I like to see people enjoy VR. I like to see people feel that it’s worth investing in. Oculus in particular came from students of a friend of mine, who created a startup out of his class. Mark Bolas? Are you aware of him? Mark Bolas was then a professor at the University of Southern California. And his student projects included creating the cardboard viewer that Google advertises as Google Cardboard, or whatever they call it, and the Oculus Rift. And I thought, you know, this is cool. Like, VR’s fashionable and people like money, so this’ll— I mean, obviously I was enthused about that. I thought then, and I still believe, that Facebook will change its business plan before it’s too late. So I still kinda believe that, you know. I really do.

Mason: Most people have their first virtual reality experience with the Oculus. They lose their VRginity to the Oculus Rift. But you lost your VRginity back in the 1980s. Those were the early days, when you were dealing with these very, very clunky devices. What was it like to create those sorts of brand new virtual reality devices?

Lanier: Well, I mean, just to be clear. Like, the EyePhone… The better of the two EyePhone models was every bit as good as a current Oculus Rift or Vive or something. They were just super expensive. But yeah, we had to make them. We had to think about, like, how do you actually manufacture this thing? How would you… Nobody had ever made anything like that before. Nobody had made the wide-angle optical thing and figured out how to mount it on a head, and how to mount it on different kinds of people’s heads, and just the whole business. We had to invent a product category.

Mason: And then you say in the book that the wonderful thing about these big clunky pieces of hardware is that they make the VR more ethical.

Lanier: Yeah, absolutely. So the deal here— So what is the difference between a magician and a charlatan? The difference is that the magician announces the trick, right. And in fact you can even know how a trick works and appreciate a magician even more, right? So the deceit isn’t the core of magic, right, it’s the artistry.

And so I think the question is, how do you announce the trick in virtual reality? You must announce the trick. And this inclination that many people seem to have to want to make the devices as invisible as possible, or even to sort of just be in VR all the time, strikes me as being both preposterous and missing the point, and also unethical, because it then fails to announce.

I call it preposterous because, like, to me VR is beautiful. Like, a well-crafted virtual world experience can be extraordinary. And then to say, “Oh, this should just be on all the time,” it’s like somebody telling me, “Oh, you like classical music? We’ll just leave it on all the time.” And, “Oh, you like wine? Well, you should drink it all the time, all day long.” I mean, it’s just, like, not the way you treat something you love. It just makes no sense at all to me.

Mason: A lot of peo­ple fear that vir­tu­al real­i­ty is going to dimin­ish their expe­ri­ence of the world but you believe that it’s going to height­en perception.

Lanier: Right. Well, back in the old days…I’m so old. But back like in the 80s, my favorite trick in giv­ing a VR demo was to put like a flower out on the table while some­body was in the VR expe­ri­ence. And then when they’d come out they’d look at this flower— You should try that. There’s this way that it sort of pops into hyper­re­al­i­ty because you start— You’ve giv­en your ner­vous sys­tem a chance to adjust to some­thing else, so it re-engages with this phys­i­cal world fresh­ened. And I think that’s the great­est joy of vir­tu­al real­i­ty, real­ly. That’s the very best thing there is.

Mason: But the sorts of VR expe­ri­ences that we’re get­ting today and the sorts of things that peo­ple think of vir­tu­al real­i­ty such as 360 video, that’s not real­ly VR, is it?

Lanier: Well, 360 video, which is another thing, by the way, we were doing at the time. We had a product called VideoSphere that achieved that back in the 80s. Although analog…tape…it was all very hard to do, but anyway. I think the spherical videos might turn into a genre of their own. Like that might be something that persists, but it’s important to understand what it is and what it isn’t. I’m a little sad that it doesn’t have its own name and that it’s being called virtual reality, because the problem of course is that it’s not interactive and it doesn’t— I think it really doesn’t get to even the beginning of the core of the beauty of virtual reality.

But on the oth­er hand, some of them are very…they can be very good for what they are. And they can also be impor­tant. I mean I think as doc­u­men­tary and empathy-generating mech­a­nisms they have been impor­tant. There’ve been some very good ones made. 

The prob­lem is they could also be very effec­tive at lying. So once again we have to get the under­ly­ing eco­nom­ics and pow­er struc­ture right or what­ev­er good they can do will be nul­li­fied. But I’m not will­ing to just diss them. I think they’re a thing in their own right. I think they’re impor­tant. It’s like say­ing oh, for­get that black and white pho­tog­ra­phy, it’s just a pass­ing phase. I actu­al­ly think black and white pho­tog­ra­phy is a thing that per­sists because it has its own beau­ty, and some of these things have integri­ty and we should­n’t just think of them as only hav­ing val­ue because they’re on the way to some­thing else.

Mason: I mean, what are the sorts of virtual reality experiences that you are at least hoping for? What sort of worlds do you want to build?

Lanier: Well, for me… First of all let me men­tion there’s a lot of great work being done now, and I like to take the oppor­tu­ni­ty to men­tion younger design­ers. And so I’ll men­tion for instance Chris Milk, who did a work called [“Life of Us”] which is real­ly a lot… It’s a lot like the sort of thing I used to love the best. Your body morphs through evo­lu­tion­ary phas­es where you turn into dif­fer­ent crea­tures. And it’s social with oth­er peo­ple and it’s got a lot of ener­gy to it. He had to do it in a way that it’s a sort of a timed, sequen­tial expe­ri­ence because it’s intend­ed to be shown with docents in pub­lic places. So it’s not the sort of thing where you explore at your own pace. But I think it’s a suc­cess­ful… People have tried it? No, okay. Well, any­way I think it’s a real­ly good one.

As far as things I want to build, there’s… Oh god, I can men­tion some­thing. I’ll men­tion some oth­er things as we go through the con­ver­sa­tion. But what I notice now is that the small inde­pen­dent design­ers are in my opin­ion doing the best work. Although, some of the big stu­dios do good things, too. But a lot of the best qual­i­ties of VR come out in the small­est details that can be done by small teams work­ing with very lit­tle expense, actu­al­ly, if they’re careful. 

For myself, the goal I’ve always dreamed of the most is some sort of impro­visato­ry sys­tem where you’re inside and you play like vir­tu­al musi­cal instru­ments or some oth­er sort of thing. And by doing that you can change the world in any way and invent the world while you’re in it, and co-invent it with oth­er peo­ple so it becomes a shared inten­tion­al waking-state dream, as one used to say.

And the method of cre­at­ing tools that are capa­ble of that is still elu­sive. I’ve tried a lot of dif­fer­ent ways to do it, and I still believe it can be done. And I actu­al­ly have a whole thing in the book about that and the prospects for it. But that’s the thing I’d most love to see.

Mason: You talk about that prob­lem with the soft­ware that’s used to cre­ate vir­tu­al real­i­ty is often on a lap­top, on a two-dimensional lap­top, you can’t jump into VR and cre­ate just yet.

Lanier: Yeah, this thing of design­ing it in some pro­gram­ming lan­guage and then jump­ing in is…ridiculous. I mean that’s com­plete­ly miss­ing the point. 

And I should men­tion some­thing else. The way a lot of the com­pa­nies have set up stores where you down­load an expe­ri­ence and then you’re expe­ri­enc­ing it, is also wrong because it should be clos­er to like Skype than to Netflix. There should be live inter­ac­tions, but also there should be a role for live per­form­ers in it. There should be a whole new world of peo­ple who are sort of like pup­peteers, or dun­geon mas­ters or what­ev­er you might call them who are impro­visato­ry per­form­ers with­in vir­tu­al worlds, where that’s actu­al­ly the main point. Because that’s just much more appro­pri­ate to the medi­um than think­ing of it as a down­load. And I think that there’s been a real cat­e­go­ry error in the nascent vir­tu­al real­i­ty indus­try on that point.

Mason: Well why do you think we’ve run down this path of trying to re-present reality as it is and turn it into virtual worlds? Why do all these virtual worlds look so familiar? Why aren’t we creating othered experiences?

Lanier: Well, I mean some— There are— As I say, I mean I think there are really good designers, so I should focus on them rather than the crap, of which there’s a lot. So I’ll mention Vi Hart. She does math explication virtual reality experiences, but they’re extraordinary for learning how to walk around in four-dimensional spaces or something. And she has a wonderful group of people who build these things and they’re just fantastic.

So I’ll men­tion anoth­er one. The thing is, let’s focus on the good stuff. I mean, why are there so many bad movies? I mean my god. I don’t think there’s any like, expla­na­tion need­ed for why there’s so much crap in any giv­en medi­um. I think that’s kind of—

Mason: But what I worry—

Lanier: Or maybe there is but it’s an old mys­tery, not a new one.

Mason: But where I wor­ry is peo­ple are com­ing to vir­tu­al real­i­ty and hav­ing these very kind of vapid expe­ri­ences where the agency is—especially in a 360 video—the agency is that of the direc­tor. They don’t get to have this kind of elat­ing, individualist—

Lanier: I know. It’s a big prob­lem, you know. I’m real­ly kind of bummed about that. I’m real­ly kind of bummed that a lot of peo­ple think they’ve expe­ri­enced vir­tu­al real­i­ty, and what they’ve actu­al­ly expe­ri­enced is some­thing that was pret­ty shod­dy. And that’s a drag.

But you know what? I mean… I just think that that’s what it’s like when a new medi­um comes along. I mean I think when cin­e­ma start­ed there was a lot of stu­pid stuff that peo­ple saw first, you know. I mean that’s hon­est­ly true. We like to remem­ber the high points but, there was actu­al­ly a lot of crap. 

For one of my earlier books I started looking at what books other than the Bible were printed when the printing press became available, and it was not all great. There was a lot of really stupid stuff. So you know, I don’t know what motivates people to put all this work into making something that’s really shoddy and stupid. But that’s true for like— You know, you look at a movie and like, why—they had all this money. And then they threw it into this thing that you could tell that everybody who’s making it knows it’s a piece of crap. Like why? Why not stop and say, “Hey, we’re making a piece of crap. Let’s spend this money on making a better thing.” Like, why don’t they just stop for like— I don’t know why. I mean it’s like one of the great enduring mysteries. I just— I don’t get it but that’s what happens.

Mason: Well there’s truly great virtual reality. And not great in terms of the content but great in terms of what it’s able to do to the human individual. So we’ve already seen that virtual reality can be effective at helping cure PTSD. It can be as effective as morphine at pain treatment. You can create empathy—

Lanier: Can I com­ment on both of those for a second?

Mason: Please, yeah.

Lanier: Yeah. Those are interesting to me in different ways. On the PTSD treatment, this is a case where I was initially really cynical about it. When people started doing research on that I thought, “Oh, this is just too cute.” This sounds like somebody just wanting headlines. It’s like, it’ll be really catchy. Oh yeah, we’ll use VR to treat these things. But the clinical results came in and they were replicated, and it turned out to be a real thing. So that was a case where I was a little too cynical.

And then the oth­er one you men­tioned on pain relief. One of my stu­dents who’s now—she’s just become a pro­fes­sor in the med­ical school at Cornell in New York, Andrea Won, came up with an amaz­ing thing. Can I tell you about it?

Mason: Please. You can’t set it up like that and not tell us.

Lanier: Well I don’t know. So, everybody here will know the difference between what we can now call classical occlusive virtual reality and mixed reality—also known as augmented reality. So what she did is she took a population of people who had chronic but localized pain, and put them in social settings with mixed reality headsets. Meaning that they still saw the physical world, but with extra stuff. And then they would paint on their bodies where the pain was. And it could look like a virtual tattoo or a Band-Aid or something. And then it actually stuck with them. And to get that to happen is a whole tech problem. It’s not gonna happen with an off-the-shelf HoloLens. But anyway, it happened in the laboratory setting.

And so they had these arti­fi­cial tat­toos that were per­sis­tent. And then grad­u­al­ly, over weeks and months, they start­ed to dis­si­pate, and we did­n’t tell them it was going to hap­pen. And then a sta­tis­ti­cal­ly sig­nif­i­cant num­ber report­ed their pain dis­si­pat­ing. Isn’t that cool? Yeah.

And then just as a reminder, every time some­body iden­ti­fies a tech­nique of that kind, pre­cise­ly the oppo­site hor­ri­ble, sadis­tic ver­sion of it is also hypo­thet­i­cal­ly avail­able. So it all comes down to incen­tives, pow­er struc­tures, ethics, soci­ety. Like, there’s no such thing as a com­put­er that’s going to be eth­i­cal on your behalf, or kind on your behalf. That’s up to you—it’s up to all of us. So this is not like some panacea sto­ry. It’s actu­al­ly a chal­leng­ing sto­ry if you under­stand the whole dynamic.

Mason: I was going to ask, if we’ve proven that vir­tu­al real­i­ty has these won­der­ful egal­i­tar­i­an uses and changes the mind and changes the body, then can it be used to inverse­ly not just cure trau­ma but invoke trauma?—

Lanier: Hell yes.

Mason: Could real­i­ty be the ulti­mate mind con­trol device?

Lanier: Yeah. Virtual real­i­ty has the poten­tial to be the creepi­est inven­tion. That’s absolute­ly cor­rect. This has also been stud­ied. So I have a co-researcher and friend named Jeremy Bailenson at Stanford. And we’ve been study­ing how chang­ing avatars changes peo­ple. So here are some things that have been not only pub­lished in peer-reviewed jour­nals but repli­cat­ed in mul­ti­ple places.

One is you can make some­body more racist with­out them real­iz­ing you did it to them. You can make some­body less con­fi­dent in a nego­ti­a­tion with­out them real­iz­ing you did it to them. You can make some­body more like­ly to buy some­thing they don’t need with­out them real­iz­ing you did it to them. And the list goes on. 

Mason: So this sounds like Facebook again.

Lanier: Well see that’s the thing. That’s what must not be allowed to hap­pen. So Facebook sim­ply has to change its busi­ness plan before Oculus gets any good.

Mason: So what’s the solu­tion for that? Is it gov­ern­ment reg­u­la­tion of VR and…?

Lanier: Well, I mean I think the solution is just to talk like this until they’re shamed into doing it. I know that sounds crazy. But I think that’s— No, look at—you know, Sean Parker is with the program. And I’m working on Peter Thiel, and eventually, like, this thing is going to crumble. Because it just makes sense. It’s just like, what’s happening is too stupid for people of good will who are not stupid to endure. Like they have to just change it.

Mason: I want to talk— Yeah, right. And the fact that there was silence is slight­ly dis­turb­ing, espe­cial­ly from this audi­ence. I was expect­ing every­body to leave and riot now. 

Look, this other thing about VR… Do you think… The folks who see how effective it is, they go back and say, “You know, it’s just like…this thing like psychedelics.” You spent a lot of time, I know, with Timothy Leary, and they were talking about cyberdelics. And that term is coming back into fashion recently because it kind of proves the efficacy of VR. People say oh, it’s like a cyberdelic experience. And what you actually get is like, 60s visuals with 90s rave music over the top. [Lanier laughs] Well it’s…it’s terrible.

Lanier: That sounds like Burning Man.

Mason: Do you think Timothy Leary would be disgusted by the sorts of VR experiences today? Would he go, “This is not what I meant when I said cyberdelics! This isn’t a psychedelic experience, this is someone’s bad fucking trip.”

Lanier: Well, I have some fun­ny sto­ries about Tim in the book. I knew Tim pret­ty well. I used to— We had a lot of dif­fer­ent— We dis­agreed a lot. I mean, in a way I had a dia­logue with him that was a lit­tle like the one with Bill Gibson, where I… I kind of felt at the end of the day that the way he’d pre­sent­ed psy­che­delics maybe had back­fired, and maybe was­n’t so great. Because he was just super utopi­an about it. And…

There’s a funny story about how I met Tim. So, I’d been complaining to Tim about— I’d been complaining, but we’d been communicating indirectly, mostly through rants in like, underground zines, which is what people used to do before there was an Internet, and there’d be like this zine in the back of a book store like this, and it’d be terribly printed and it would have like some poetry and weird art and stuff, and then…anyway. So we’d been complaining to each other. And so finally he said, “Okay, let’s meet.”

I said, “Great!”

And he said, “Well, I have to teach this course in something or other at Esalen Institute.” Do you all know what that is? It was this super influential kind of very utopian—it’s still there—sort of New Age institute that’s located in the world’s most beautiful location, on these cliffs above the ocean with natural hot springs. And it’s where a lot of cultural trends started that are associated with 60s or alternative culture. And it’s been around for quite a while. But just a lot of little things like workshops you might go to, or yoga, or food you might eat. A lot of stuff started there. So it’s a really big formative place, culturally.

Anyway. So I said, “Great. I can drive down.” He was coming up from LA. “I’ll drive down and meet you at Esalen, and we can meet.”

And he said, “Well, actually… I really don’t want to teach this workshop. So what I’ve done is I’ve hired this Timothy Leary impersonator [audience laughter] and I’m going to smuggle him in. He’s going to—” The guy who used to run it—or still runs it—is Michael Murphy. He’s a friend of mine still. “And as soon as he’s gone, I’m going to just have the impersonator do the rest of my workshop. And what I want you to do is smuggle me out in the trunk of your car past their guard gate so that I can get out of doing this thing. And then smuggle me back right at the end so I can get paid.”

And I was like, “Suuure,” you know? Like, fine. Like, the stuff we do for money in Silicon Valley is a lot less dignified than that… I might as well go for it. So.

I had this really super beat-up jalopy that I’d had forever from New Mexico that… Oh, this is a whole— These stories go on and on. But this thing, it was really messed up. It didn’t have back seats. It had hay in the back because I used it to move goats around and stuff. But the trunk was completely filled with early computers, so I had to get together with a friend of mine at Stanford and we were like, dumping these computers that’d be worth a ton today. Like all these really— just to create enough space for him to sit there. And we were like, well, would Tim fit where we’d created this hole…

And so I went down there and sure, he fit. Although these computers fell on him and he was like, “Oh! Oh!” I think an old Apple Lisa fell on his head is what happened, if you know what that is.

Anyway. So I’m like all tense and like, I’ve never smuggled something past a guard gate before. And I’m like, oh, so like, I’m gonna be in jail. This is going to be horrible. I’m going to be a party to fraud and some felony and my future’s gone… So I’m driving up and there’s this guard gate and we’re driving up to it. “Hello, hello.”

And then there’s this totally stoned hippie guy who like, can’t even look up, just like, “Uhhhh…”

But like, [makes zoom­ing noise] Anyway, that’s how I met Tim.

Mason: So from that meeting you had, you got a feeling for how he felt about VR.

Lanier: Oh oh oh. That’s right, you asked a seri­ous ques­tion. You’re con­fus­ing me.

Mason: I’m about to give up and hand it to our audi­ence in a sec­ond, but—

Lanier: That might be wise. I mean, it’s hard to really know what Tim would have said. I know… Tim was a good guy. I mean, he wanted the best for everybody. He um… I mean, I really don’t know what— I think what he might say now is that boy, all those just, you know… I mean, the really interesting question is like, what would somebody like William Burroughs say. Because we’ve entered into a period of such darkness. It’s… I don’t know. I mean I think he’d be heartbroken, the way we all are.

Mason: But there is also a real opportunity with VR to imagine some of these alternative business models that you’re talking about. So the thing with the virtual reality platform is that it’s reliant on capturing a degree of human data. And we may get to the stage where we’re creating these experiences through capturing neuro data and bio data. Could virtual reality be the platform that reimburses the individual in exchange for that sort of data?

Lanier: I’d like it. I mean…so here’s the thing. If data comes from you, and you’re in con­trol of it, and it has an influ­ence on what you expe­ri­ence, that’s one thing. If that is meshed with influ­ence from some unseen paid per­son who gets in the mid­dle of the loop and influ­ences what comes back to you as a result of your own data, that is manip­u­la­tion and that’s the end of human­i­ty. We can­not sur­vive that. 

So one of the things that’s for­tu­nate is that it’s very easy to define where the prob­lem is, because it’s an infor­ma­tion flow. There must not be an infor­ma­tion influ­ence from unknown third par­ties on how the feed­back loop works—that must be up to you. So that’s the first thing to say.

And then there’s a very nice fall­out from that, which is that’s also a path to endur­ing human dig­ni­ty as the tech­nol­o­gy gets bet­ter and bet­ter. And I’d just like to explain maybe a bit about that before we take ques­tions? Is that okay? 

Mason: Please, yeah.

Lanier: Right. So what’s been hap­pen­ing now is we steal data from peo­ple, and then we use the data to feed arti­fi­cial intel­li­gence algo­rithms. But because we call them arti­fi­cial intel­li­gence algo­rithms instead of just algo­rithms, we have to cre­ate this fan­ta­sy that these things are alive, that they’re free-standing. And so then we give this mes­sage to the peo­ple we just stole from, that they’re no longer need­ed and their only hope for sur­vival as the algo­rithms get bet­ter is that they’ll become wards of the state on basic income or some­thing. And it’s a very cru­el, stu­pid, demean­ing lie. 

The clear­est exam­ple I’ve found for intro­duc­ing this idea to peo­ple who haven’t heard it before is with lan­guage trans­la­tion. So the peo­ple who trans­late between lan­guages, like between English and German let’s say, they’ve seen a dras­tic reduc­tion in their life options. It’s sim­i­lar to what’s hap­pened to pho­tog­ra­phers, to record­ing musi­cians, to inves­tiga­tive jour­nal­ists, to many oth­ers. And the rea­son why is that the stu­pid lit­tle trans­la­tions, like trans­lat­ing mem­os, can now be done auto­mat­i­cal­ly. And by the way I love these auto­mat­ic ser­vices. They’re not great. I don’t want to see my book trans­lat­ed by Bing Translate. But on the oth­er hand I’ll use it for a memo or something. 

But here’s the thing. In order for these auto­mat­ic sys­tems to work we have to steal tens of mil­lions of exam­ple phras­es, from all over the world, in all the lan­guages, from peo­ple who don’t know they’re being stolen from, who are nev­er acknowl­edged and nev­er paid. And we’re telling those peo­ple that they deserve to not be employed any­more because they’re no longer need­ed because they’re being replaced by automa­tion, when in fact they’re still need­ed and they’re still the basis of every­thing and it’s not an automa­tion, it’s just a new, bet­ter chan­nel­ing of what they do. 

So the dishonesty in that is cruel and horrible, but it’s also creating an absolutely unnecessary pessimism about the future, that people won’t be needed when in fact they will be. It’s one of the things that gets me really angry. Because there’s this dogma: “oh, it’s AI, we’re creating AI.” That’s been at the root of my hostility towards like, artificial creatures and stuff, because it ultimately does destroy not just human dignity but just the basic mechanisms of humans being able to make their way.


Luke Robert Mason: And on that note, we’re going to hand this back to some humans. And I’m going to do two things. I’m going to hand you a dif­fer­ent mic, Jaron, so you’re a lit­tle more comfortable.

Jaron Lanier: No no no, I’m good. I’m good. Is this—can you hear me? Yeah yeah, okay. Everything’s working.

Mason: Alright. The one next to you’s a lit­tle loud­er. I might trans­fer you over. So are there any ques­tions? Eva.

Eva Pascoe: Hi. Thank you very much for fan­tas­tic sto­ries. We had the plea­sure of host­ing you at Cybersalon a good few years ago when one of the ear­ly Aibo dogs had just been released. And I remem­ber you took a strong dis­like to the Aibo dog. 

Lanier: Mm hm.

Pascoe: What’s your view of Sony bring­ing it back, as they just announced?

Lanier: Well, um…I think they should be smashed. They should all be destroyed. I mean— No, this is real­ly a bad thing. I… If you believe in a robot dog, in the same breath you’re believ­ing that you’re worth­less in the future—that you’ll be replaced. It’s a form of sui­cide for you, for the rea­sons I just explained. It’s eco­nom­ic sui­cide and ulti­mate­ly it’s spir­i­tu­al sui­cide. And I feel that strong­ly about it.

Now, that said there could be spe­cial cas­es. So if there’s some­body that for what­ev­er rea­son is made hap­py by this device and it helps them because of their very spe­cial issues or prob­lems, of course I’m not going to start judg­ing peo­ple on an indi­vid­ual basis. 

But overall it’s a really bad idea. It’s a bad idea in the sense that… I don’t know, it’s just like— It’s truly a…it’s an anti-human idea. I expressed a theory the other day in a piece in The Times that the reason cat videos are so popular online is that cats are a special creature in that they weren’t domesticated by us like dogs. They’re not obedient. They’re independent. They came along and demanded that they live with us, but they’ve kept their independence. And there’s this kind of self-directedness of cats which we love, and we see it in the videos. And it’s exactly the thing we’re losing in ourselves. So when we see cat videos online, we’re looking at the part of ourselves that’s disappearing. That’s the longing that’s expressed. And that’s precisely why you shouldn’t have a robot cat.

Audience 2: Jaron, could you talk—

Lanier: Oh is that Kathlyn? [sp?]

Audience 2: Yes.

Lanier: Oh hey!

Audience 2: Hello! The musi­cal instru­ments, and the user inter­face, and the future of art in a pos­i­tive evo­lu­tion of technology.

Mason: Wonderful question.

Lanier: Well, I have an addic­tion prob­lem with musi­cal instru­ments. I have… I don’t know the count but it’s well over a thou­sand. And I play them. And I’m just a nut­case about it. And I’m not brag­ging, I actu­al­ly am con­fess­ing. Because it’s real­ly a sick­ness and it reduces the amount of air there is for my daugh­ter and wife to breathe in our home. And it pre­vents us from ever being able to walk from one point to anoth­er in our home in a straight line. 

But I love them for a bunch of reasons. One of them is they’re the best user interfaces ever designed. They’re the most eloquent, highest-acuity things you can really become virtuosic on. Not just at performing a particular task, but at improvising in a broad palette. They’re by far the most expressive and open things that have been designed. That is to say, compared to like a really good surgical instrument, which is really designed for a particular thing, a saxophone is designed not for a particular thing. In fact it’s mostly played in a way that has nothing to do with what it was designed for.

So they have this incred­i­ble open poten­tial inside them, and peo­ple can spend their lives get­ting bet­ter and bet­ter and deep­er and deep­er with them. And that’s what I want most for com­put­ers. And I could go on and on about many oth­er qual­i­ties of them, but what I hope is that even­tu­al­ly com­put­ing will be like musi­cal instru­ments. It’ll be this thing that peo­ple can get deep­er and deep­er with, that is an open win­dow into a kind of an infi­nite adven­ture that is so seduc­tive that it does­n’t have an end. That peo­ple keep on grow­ing and growing.

This is a really important point to me. One of the old debates in virtual reality is whether you could ever make a virtual reality system that’s so good that you’re done. That it fools you and that’s it. And I always said well, no, you couldn’t. And the reason why is that people are not fixed. That as the VR systems get better, people learn to perceive better. And so it’s not so much that people are just like these fixed objects, and then the VR gets better and then you’re done. No, the people change in response to the VR.

And that process of peo­ple grow­ing through their per­cep­tion of tech­nol­o­gy is… Well, I used to frame it in almost apoc­a­lyp­tic terms. What I used to say is our job as tech­nol­o­gists is to kick­start adven­tures that are so seduc­tive that they’ll seduce us away from mass sui­cide. That’s the way I used to talk in my twen­ties. But I still think some­thing like that’s approx­i­mate­ly true. That if our moti­va­tion is just more and more pow­er, or more and more effi­cien­cy, or more and more opti­miza­tion, we’ll some­how just destroy our­selves. But if our adven­ture’s one of ever greater beau­ty and con­nec­tion and going deep­er and deep­er, then actu­al­ly that’s sur­viv­able. That can go on for­ev­er. So that’s the one that has to win. 

Mason: Question just here, Vinay Gupta.

Vinay Gupta: Hi, this is me calling in from the gods. So, I’m one of the blockchain folks. Although in the 90s I spent my time as a graphics programmer waiting for VR to arrive. This is your fault. What do you think of the blockchain, both the hysteria that surrounds it and also maybe the potential for long-term, mature technology? How does that intersect with things like the story that payments will remove the misaligned incentives that produce surveillance capitalism?

Lanier: So. Blockchain, yeah, that’s been a question lately, hasn’t it? There are a few things to say. The first thing, I’d like to draw a very sharp distinction between blockchain and cryptocurrencies. So let me deal first with one and then the other.

So with blockchain… This sort of thing could be very important. Let’s remember, though, that the way blockchain works is by explicitly being as inefficient as possible, alright. And so if that really scaled up to run everything, it would be an unconscionable crime against the climate. So the only way to use blockchain as we now understand it, at the scale of computation that would be needed to scale it, would be to find some survivable way. And in all seriousness, maybe the blockchain servers have to be on the moon. Or somewhere, you know. And that might be a good thing to do with the moon. And I’m not being flippant. I’m actually completely serious.

But the oth­er thing is if you imag­ine some kind of nanopay­ment sys­tem on things like Facebook at a very vast scale, it’s pos­si­ble that the blockchain idea is like…emphasizes rig­or too much. And we need some­thing that’s a bit more…statistical and much much much much much more effi­cient. So that’s one thing.

The other thing is that the whole blockchain thing, as do many other structures, rests on what might be thin ice, because we don’t have a formal proof that some of the encryption schemes are really going to withstand… I don’t know. [inaudible audience comment (from Gupta?)] Yeah, I’m a little nervous about that. But that gets to be a kind of a geeky topic, alright.

But now let’s move to cryp­tocur­ren­cies. So here’s my ques­tion to you. Why is every fuck­ing one of the cryp­tocur­ren­cies that’s been launched to my knowl­edge a Ponzi scheme? Alright. 

Gupta: But all cur­ren­cies are Ponzi schemes.

Lanier: No, that’s not accurate. All currencies are vulnerable to being Ponzi schemes, but they aren’t necessarily. And I’d refer you to Keynes to understand that. If a currency can be well regulated by well-intentioned people it doesn’t have to be a Ponzi scheme. It can grow and change with a society.

But some­thing like Bitcoin was pre­cise­ly designed to be a Ponzi scheme. I mean it unfair­ly and vast­ly ben­e­fits the ear­ly arrivals. And then every­body who does well on it lat­er, they have a low­ered sta­tis­ti­cal chance of doing well. But any­time they do well they make the founders do even bet­ter. So it is total­ly a Ponzi scheme, but since it’s so glob­al it just takes a real­ly long time to break com­pared to like, a Bernie Madoff scheme or some­thing, which is tiny in com­par­i­son. So, they’re hor­ri­ble. And even the ones since Bitcoin that are sort of more enlight­ened in var­i­ous ways are still Ponzi schemes. And there’s no like, dif­fi­cult math­e­mat­i­cal mys­tery to how to make one that isn’t a Ponzi scheme—but they’re all damn Ponzi schemes.

So, the thing is, if we can’t get over the grandiosity and greed of people in that community, it’s not worth a thing. So the question is, when do we start to make these things that are actually for real?

Mason: That’s what scares me about that community. They want to build Web 3.0, but they’re borrowing the language of Web 2.0 to make individuals believe it’s going to be an easy transition through. And I am deeply, deeply concerned for friends and colleagues of mine who are investing in the currencies and not actually realizing that there’s real potential and opportunity in that decentralization, but it all gets thrown up into the memetic power of day trading currencies. It’s such a loss.

Lanier: And we might end up with a nasty surprise whenever the identity of the Bitcoin founder is revealed. I mean it could turn out to be Putin or something, so. I don’t know, I mean I think it’s perilous.

Mason: Any other questions at all?

Trudy Barber: Yes, it’s me, Trudy!

Mason: Oh, it’s Trudy Barber, the UK’s leading cybersexpert. [crosstalk] Make this a good one.

Barber: Sorry! In the early nineties as an undergraduate at Central Saint Martins, I created one of the earliest virtual sex environments, making a complete three-dimensional digital space, with floating condoms and all sorts of things in it. What I’m quite interested in is, as we’re seeing ideas of, you know, the 360 porn industry—which I don’t actually see as some kind of virtual sex experience, I see it as a bit of a gimmick. I’m quite interested in how we are perceiving our sexual identities, or the potential of perceiving our sexual identities, within a proper 3D virtual space where you have interaction, where you have immersion, and how we perceive kind of…that whole idea of the phantom phallus, or the phantom vagina. And the way that we perceive our gendered experience of the world, and how we could really experience and experiment with that in virtual spaces. So what’s your opinion on virtual sex in that context?

Lanier: Yeah. Well, alright so, this is another huge topic that could take a whole week or something, so I can only barely prick the surface of it.

Barber: Ooh!

Mason: Anything you say now, Jaron, this audience is a British audience. They’re going to read innuendo everywhere.

Lanier: I was told that Brits were very delicate and unable to…

Mason: You haven’t met this audience.

Lanier: Okay. So, I’ll say a few things. Let me start with this sort of… Just the— God, how do I even begin here? So what I— Just in the— Oh God. Okay, look. What I think has happened is our ideas about sexuality of the last…going back to the late 1800s, with the very earliest moving images, with zoetropes and all, is we’ve had this very strange kind of artifact orientation around sexuality that turned into porn. Which is… I think it’s hard to remember that things weren’t always that way, which is clear from historical reading.

Now, there’s been literary porn going back to ancient times and all sorts of stuff, and it’s very interesting to read some of that. But there’s this very particular reflection of cinema, which I think has narrowed people. I think it’s created a set of clichés that people grow up with that is unfortunately small, you know. It’s unfortunately limited. And I think in the future we might look back on the cinematic period as one that was profoundly anti-erotic. Because we were stuck in this feedback loop with the things that were easy to film that we would see when we’re young, and that sort of thing.

Now, I’m not sure that’s true but I suspect that’s true. And I should also point out that in pre-cinematic literature, a lot of things about gender and sexual identity are in fact more fluid. In ancient literature and even fairly recent literature. I think that cinema had a sort of a cementing effect on us psychically, in a lot of ways. So that might be— I hope that doesn’t sound offensive to any filmmakers in the audience, but.

So when I was young, like in my twenties, I was really fascinated by erotic ideas in virtual reality. But I always thought that representation of, like, body parts or something would be the least of it. And I was really interested in, like, joining bodies with somebody interactively, trying to control a shared body and learning how to move together. Which is… You can do a little bit dancing and a little bit different ways, but not like this. This is something else, and it’s something really extraordinary that very few people experience these days because I just think for some reason they’re wasting time on stupider stuff.

Barber: Where would you see haptics engaging in this?

Lanier: Yeah, that’s an interesting question. You know, I was talking about the cinematic era having an effect on sexuality, and I suspect that that’s been somewhat more pronounced for male sexuality than for female sexuality. And there’s a whole long discussion one could have about why that might be so, if it’s so, and all that. And I think as far as the intersection of technology and sexuality, for women it’s already been more haptic. And I…

So one might predict that that’s likely to continue and… In general the haptic modality’s the one that’s the crudest and needs the most work in virtual reality. And it always— Like, I’ve seen this again and again and again and again. Every time the stars line up and there’s a bunch of funding for virtual reality work, whether it’s in a corporation or university or something, the lion’s share goes to vision. And then the next biggest portion goes to sound. And then poor haptics, which is the hardest one and really should be at the front of the list, gets kind of like the crumbs. And it’s frustrating. It’s exactly backwards, [crosstalk] but that’s just a repeated problem.

Barber: There is the datafication of haptics starting to happen now, with the recording of sexual responses. And I think maybe that might be a different way forwards, combined with the virtual immersion, where the datafication of pleasure might actually expose different ways that we actually engage with our bodies. And also how it becomes like a commodity that is sold on. Which fits in with some of the other arguments that you make.

Lanier: Well, perhaps so. I mean, the concept of pleasure is one of the— Things like pleasure, and happiness is another one… I wonder if in the future we’ll understand these words to be…much broader and more process-oriented than we do now. Because I have this feeling that people tend to think of pleasure as a sort of…a destination, and happiness as a destination, almost like a formula that’s been fulfilled or an equation that’s been solved. And I’m sure that that’s the wrong way to think. And so what I hope is that it will expand into much more of an ongoing, infinite process.

One of the books I used to quote all the time in the 80s, and I think I still mention it in the new book—I can’t remember anymore—is called Finite and Infinite Games by James P. Carse. He proposed that just as a— In a first broad-brush way of understanding reality, you can divide processes into finite or infinite games, with a finite game being something like a particular…well, a game of football, that has to come to an end. Whereas the overall field of football is infinite and need not come to an end. And so you have to understand which things are finite and which things are infinite. And right now the way we’re approaching technology on many levels is finite. It has to come to an end, which would mean our end. But if we can coax it into its infinite cousin, then we have a means of survival. And that’s true on every level—economically, aesthetically, and on every level.

Barber: Thank you.

Lanier: Sure, thanks for the question.

Mason: Any other questions at all?

Audience 5: Hi. I love the way your opening gambit was to ask us if we’d read a particular book. Which I had, by the way. And you were a bit disappointed that most of us hadn’t. And throughout your talk, actually, you talked about books and reading a lot, which might come as a surprise to people who think it was going to be much more visual-oriented—

Lanier: Well I am hoping you’ll buy a book. I mean, that’s why I’m— Like I mean. You’re witnessing abject corruption here. And you know, don’t don’t pretend it’s like some elevated state of consciousness. This is raw American salesmanship. [inaudible comment from A5] Yeah okay, go ahead.

Audience 5: So what I wanted to ask you, if it’s not too personal, is what are you reading at the moment?

Lanier: Um. Yeah no, that’s a great question. I’ve been just reading the Lapham’s Quarterly special edition on music, which has all sorts of really interesting ethnomusicology scholarship that I’d never run across before. All sorts of obscure things, and those are really really fun. And I’ve been reading… Well, Masha Gessen’s book on the decline of Russia, which is a very very sad and terrifying thing to read. And you know, impressive and wonderful but not easy. And I’ve been reading… Let’s see. I’ve been reading Roger Penrose’s book on fads in physics. And I’ve been reading um… God, there’s so much stuff.

I mean, I love reading. I just—I adore reading. And the thing about, I mean… I sometimes wonder what—you know. We have a little bit in ancient literature about people who were skeptical of reading when it was still somewhat fresh and novel. That you know, we’ve been warned that it’ll ruin memory. That it’ll make people weak. And you know what, I think that’s all true. I think that there’s a dark side to reading. But one of the things I love about it is people sorted it out. Like, there’s been Mein Kampf, there’s been horrible books, books have been used to manipulate people. And yet we’ve sorted it out, where overall books are this wonderful thing. So as with musical instruments, they’re another inspiration for what must happen with computation.

Mason: We probably have time for one or two more questions.

Audience 6: Hello. 

Lanier: Hey!

Audience 6: It’s great to be here. I want to kind of draw on something that you briefly mentioned, and that’s the idea of empathy, which is becoming particularly prevalent in the marketing of VR as an empathy machine. And I kind of wondered why you think we should kind of draw the line with that. For example, Mark Zuckerberg saying he felt when he went to Puerto Rico via VR, saying that he felt like he was actually there. And this kind of thing that’s kind of being perpetuated in order to sort of market VR as a consumer product. So yeah, I was kind of wanting to know what your thoughts are on the rhetoric of the empathy machine.

Lanier: Well…sadly what happened in this case is he was, uh— They did this sort of pretty basic cartoonish thing where they had a circle video of being in a flooded and destroyed part of Puerto Rico, with a couple of Facebook executives in as avatars, talking about how magical and amazing it was to be there and how “it felt like it was really there!” And then afterwards he apologized, saying, “Oh, I forgot to mention that this was supposed to be about empathy for the people there.” Like he just totally spaced that part out at first and then had to correct himself later.

So yeah, the whole empathy-in-VR thing used to be my spiel. I kind of started that language. And I mean it can go either way. The truth is there’s nothing automatically empathetic about VR technology. I mean like, what I’m concerned about with the rhetoric of selling it as empathy right now… I mean they stole my shit, that’s fine. But the thing that really bothers me is that it’s suggesting that the technology itself is going to provide empathy, which is…ridiculous. I mean, for god’s sakes, it’s like a gadget you’re buying from some company. It’s not going to be empathic for you. Like, that whole notion that empathy comes from the technology is wrong. If technology can help a little bit at the margins to help people express empathy or find empathy, that’s great, but it’s a human thing. And so that miscommunication is just stupid. And this Facebook thing in Puerto Rico was like a great example of how it can go wrong.

There are a few advantages and a few problems to VR as an empathy device. The advantages are that it can convey aspects of human experience that can’t be conveyed by other media. It can convey what it’s physically like to walk in somebody else’s shoes. For instance there’ve been virtual worlds to convey what it’s like to have different physical disabilities or whatever, and I think that that’s really interesting work.

The bad side of it is that if you capture the world suitably to replace something about it in virtual reality, you’ve captured enough of it that it’s easier to fake than if it was just a photograph. And so this is a bit of a subtle technical problem. But if you take a photograph and you want to fake it, if you Photoshop it, there’s some things that are very…there are traces that are very hard to fully cover. So you can write programs to find the little weird patterns that are left over from that operation. So there’s at least a little bit of a film of protection, if you can get at the original digital file. Of course if somebody’s just done a low-res version of it or something, then at a certain point you’re just stuck.
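Those “little weird patterns” are a real thing: image-forensics tools hunt for statistical inconsistencies that editing leaves behind, such as a spliced-in region whose sensor noise doesn’t match the rest of the image. A toy one-dimensional version of that idea, on synthetic data rather than a real photograph (not any particular forensic tool’s algorithm):

```python
import random
import statistics

def local_variance(signal, win=32):
    """Variance of each non-overlapping window of the signal."""
    return [statistics.pvariance(signal[i:i + win])
            for i in range(0, len(signal) - win + 1, win)]

def flag_splices(signal, win=32, ratio=4.0):
    """Flag windows whose noise variance differs wildly from the
    median window: a crude stand-in for the noise-inconsistency
    checks forensic tools run on edited photographs."""
    vs = local_variance(signal, win)
    med = statistics.median(vs)
    return [i for i, v in enumerate(vs)
            if v > med * ratio or v < med / ratio]

rng = random.Random(1)
# "camera" samples carry sensor noise; the spliced-in region is
# suspiciously clean
authentic = [rng.gauss(128, 8) for _ in range(256)]
patch = [128.0 + rng.gauss(0, 0.5) for _ in range(64)]
spliced = authentic[:96] + patch + authentic[160:]
suspicious = flag_splices(spliced)
# the windows covering samples 96..159 stand out as statistical outliers
```

Real detectors work on JPEG compression artifacts, demosaicing traces, and 2-D noise residuals rather than raw variance, but the principle is the same: edits disturb statistics the eye never sees.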

But if you’ve really captured a virtual world… Like, let me give you an example. Let’s say you take a photo of something and you want to change what appeared to happen, so that you’re trying to generate an outrage machine based on shitposting fake news instead of something real. Which is the more typical thing these days. That’s more common than an attempt to tell the truth. Lying has been amplified like a million times while the truth has only been amplified by like three times or something. It’s like a complete inequity.

And so let’s say you’re trying to lie by changing a photograph. Well, you have to look where all the shadows lie. You have to make it consistent. And there’s not going to be any algorithm that does that completely perfectly. You have to sit there and look at it.

But if you’ve captured it volumetrically, then all you have to do is move the sun. I mean, you all of a sudden have turned it into a simulation, and a simulation is parameterized and so it’s easier to change. So suddenly you can lie with it better.
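“Move the sun” is literal: once a scene is captured as geometry, lighting is just a parameter, and every highlight and shadow re-derives itself consistently from one change. A minimal Lambertian-shading sketch of that point (the scene geometry here is made up for illustration):

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def shade(normals, sun):
    """Lambertian shading: each surface point's brightness is the dot
    product of its normal with the sun direction, clamped at zero."""
    s = normalize(sun)
    return [max(0.0, sum(a * b for a, b in zip(nrm, s)))
            for nrm in normals]

# a volumetric capture stores geometry; here, three surface normals
normals = [(0.0, 0.0, 1.0), (0.0, 1.0, 0.0), normalize((1.0, 0.0, 1.0))]

noon = shade(normals, (0.0, 0.0, 1.0))     # sun directly overhead
evening = shade(normals, (1.0, 0.0, 0.0))  # sun moved to the horizon
# one parameter change re-shades every surface consistently; faking a
# flat photograph offers no such knob, so each shadow is hand-edited
```

That single-knob consistency is exactly why Lanier says a captured simulation is “easier to change” into a convincing lie than a photograph is.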

So, the empathy thing is real. I spent many hours promoting the idea of virtual reality as an empathy machine in the 80s. It’s legit in a sense, but it’s all about you. The machine can’t do a damn thing for you.

And I… I don’t know. This idea— I don’t— Techies just don’t seem to be able to…take a human-centric approach sometimes. They just really have to think of the tech as being where the action is. And it never never never is. It isn’t even real without us.

Mason: So, Jaron, I’m going to ask you one final question. Because the Virtual Futures audience spans about twenty-five years. It’s an intergenerational audience. How can we better work together to change some of the issues that you raised, and together create a more desirable future?

Lanier: Well—

Mason: Or are we just…fucked? We just…there’s no hope.

Lanier: You know, the speaking-truth-to-power thing is still legit. Like, just… I mean I think in a way just sort of pointing it out really helps, you know. I’d either be trying to stop using social media, or use it very carefully and differently than you have or something. All the people trying to change the world with social media, as I explained earlier, are completely undoing themselves at every turn, and you have to learn to overcome that addictive illusion. And I think you have to turn to other institutions that are still standing.

And I’m worried about us saving our governments. I’m just afraid that it’s become so distorted that it might—that we’re all falling apart here. So I think just trying to save governments is a good project right now. We used to worry about them being too powerful and now I feel like they’re so fragile they could just vanish.

Mason: Well, is it a case of what expands contracts? I mean, the technocracy is the replacement of the government, and we’re hoping that the government now is going to be the thing that fixes the technocracy. Are we just going to continue to see this expansion and contraction between the government, technocracy, government, technocracy?

Lanier: Yeah, I don’t know. I mean it’s a funny thing. If we think of the tech companies as the replacement for government… Which a lot of people are doing. I mean it’s true to some degree already. I mean, the good news is that the people who run the tech companies are still kind of relatively young and they’re almost all really nice. I mean…they’re good people, you know. I mean like, in a choice between Zuck and Trump, I mean obviously Zuck, right? I mean there’s not even a…

But the problem is that the amount of wealth and power that is concentrated in the tech companies is so great that we can’t just think about the current batch of cute kids there, you know. We have to think about who’ll inherit the power. And when power is concentrated it tends to be inherited by less attractive figures. So, Bolsheviks were cuter than Stalinists, let’s say, right. And so with Facebook, it’s the first large public corporation controlled by a single person, and that has to be inherited somehow. There has to be some kind of way that it turns into something else. And there are all the scheming, horrible people in the world, you know, and who owns it, like, in a hundred years if it continues to be what it is?

So I think that the tech companies, as government, are the least representative governments ever, you know. I mean even less so than a royal—like a… I mean, I think even like someplace with a royal family, like Saudi or something, still has to be a little conscious of whether the people are going to totally revolt. Whereas something like Facebook has a kind of degree of impunity, because everybody’s addicted. It’s a completely different relationship that favors them in a way that citizenship doesn’t necessarily favor a state.

Mason: Well, on that note, I hope there’s at least a little hope. And hopefully—

Lanier: I’m full of hope. I just uh…I’m a realist, though. I mean I think you can be a hopeful, optimistic realist, and it’s just…it’s just work, that’s all.

Mason: So, hopefully this audience will keep doing exactly what you said. Keep having these issues out in public. And on that note I want to offer a couple of thank-yous. Firstly to the Library Club for hosting us. The Library Club are a wonderful venue, and they’ve been very kind in having us here.

And a massive thank you to Cybersalon and to Eva Pascoe. And it’s because of Cybersalon— This was a free invite-only event, and because of Cybersalon we’re able to cover our film costs. And everything that’s produced is going to be released under Creative Commons, so you can remix it however you like.

Lanier: Part of what’s destroying the world, as I pointed out.

Mason: Really? Alright, I’ll stick it behind a paywall and you can all pay us more.

Lanier: No no no no. You’d have to pay them, because they created it.

Mason: Alright, I’ll pay you to watch the video. [audience laughter] Ugh. So look, we’re entirely audience-funded, and I don’t know what you’re going to say about Patreon, but if you like what we do [Lanier laughs] please support us. I don’t know, just…buy me a beer?

And you can find out more about Virtual Futures at “virtualfutures” pretty much anywhere online. And of course a massive thank you to Penguin Books, who’re on the balcony. And Jaron’s signed a bunch of books that’re available for sale. So please don’t all rush up there, because they’re gonna get hounded. But the books are available for sale at the back. And thank you to the Penguin team for making this possible.

And I want to end with this, which is how we end every single Virtual Futures. And it’s with a warning. Because the future is always virtual, and some things that may seem imminent or inevitable never actually happen. Fortunately, our ability to survive the future is not contingent on our capacity for prediction. Although, on those much rarer occasions, something remarkable does come of staring the future deep in the eyes and challenging everything that it seems to promise. I hope you feel you’ve done that today, and assuming that this isn’t an impersonator, please put your hands together and join me in thanking the incredible Jaron Lanier.