Luke Robert Mason: You're in for a real treat this evening. I am blessed to be able to welcome Jaron Lanier to Virtual Futures. My name is Luke Robert Mason, and for those of you here for the first time, which is pretty much Jaron and nobody else, the Virtual Futures Conference occurred at the University of Warwick in the mid-90s, and to quote its cofounder, it arose "at a tipping point in the technologization of first-world cultures."

Now, whilst it was most often portrayed as a techno-positivist festival of accelerationism towards a posthuman future, the "Glastonbury of cyberculture" as The Guardian put it, its actual aim hidden behind the brushed steel, the silicon, the jargon, the designer drugs, the charismatic prophets and the techno parties was much more sober and much more urgent. What Virtual Futures did was try to cast a critical eye on the phenomenal changes in how humans and nonhumans engage with emerging scientific theory and technological development. This salon series completes the conference's aim to bury the 20th century and begin work on the 21st. So, let's begin.

Luke Robert Mason: For this crowd, Jaron Lanier needs no introduction. Our distinguished guest is credited with inventing the iPhone. Except it wasn't the iPhone, the small smart computing device made by Apple, it was the EyePhone, or E-Y-E-phone, quite literally a phone for your eyes.

His new book Dawn of the New Everything tells the story of this device and the VR startup that created it, VPL Research Inc., a company that Jaron founded in 1984. Anyone notice the troubling irony? Nineteen…eighty…four.

Often credited as having coined the term "virtual reality," it is Jaron that we have to thank or perhaps chastise for the virtual insanity that's been plaguing all culture of late. As the VR industry promises enhanced worlds in which we're all gaming with each other, Jaron's book reveals VR's dangerous potential in allowing us to game each other. Or be gamed by more problematic external power structures.

But this is not a dry book of dystopian technological predictions; it is hopeful and it is hyperlinked. And when you read it you'll know what I mean by that. This book is a manifesto for worldbuilding. And what's so clear is that Jaron truly and deeply understands what it really takes to develop immersive and, more importantly, satisfying experiences in VR. And that's not because he's an accomplished computer scientist or technologist, although those things are true, but it's because he's deeply connected to what it means to be human.

So to help us understand the new everything, please put your hands together, stamp your feet, go wild, and join me in welcoming Jaron Lanier to Virtual Futures.

Jaron Lanier: Hi!

Mason: So Jaron, I want to start at the beginning of the end. I want to start in the 80s when you moved to Silicon Valley. Because when it comes to the Internet, we got what we wanted but it wasn't necessarily what we thought it was going to be. How did you envision cyberspace?

Lanier: Oh gosh. Well, I mean… That's quite a hell of a question, because it's a, it's a big thing. Cyberspace is a particular word that was made up by Bill Gibson, who writes books. And around the time we actually— [comment from audience about using his microphone] Oh, god. Alright here, how about that? Is that good? Can you hear?

Yeah, so around the time we incorporated VPL, Bill published a book called Neuromancer that I'm sure people remember. And the rule used to be that everybody has to come up with their own names, so there were a zillion names in currency for what we ended up calling virtual reality, which was my little version of it.

And Bill… Oh God, you know. Something I feel like I can't fully tell this story without his permission because some of it's very personal.

Mason: We’re among friends.

Lanier: Yeah. But I’ll leave out some of it. But he took not only— He was writ­ing in I think an extreme­ly impor­tant and won­der­ful dystopi­an tra­di­tion about this very thing. Much of it English, which I— How many peo­ple have read The Machine Stops by E.M. Forster? Oh, please…

Mason: That’s dis­ap­point­ing for [inaudi­ble].

Lanier: Oh come on, peo­ple. Come on. Get with it.

Alright. So this was writ­ten some­thing like 110 years ago. And it’s a sci­ence fic­tion novel­la. And this is the fel­low you know who wrote Room with a View— Can you guys still hear me, by the way? Is this is close enough? Yeah? No?

Audience Member: Not very well, no.

Lanier: Here, I’ll just kiss this stu­pid piece of foam, for you. Is that dystopi­an enough for you? Is this what it was all for?

Okay, so. So any­way. So I for­get if it was…it’s some­thing like 1907, 1908, 1909, some­thing like that. E.M. Forster writes this novel­la, and he describes a pop­u­la­tion addict­ed to these screens con­nect­ed to giant glob­al com­put­ers. And the screens are hexag­o­nal, sug­gest­ing that the peo­ple have become like a bee­hive, like a hive mind. And they talk about top­ics and they do lit­tle video calls. And they all get kind of lost and every­thing becomes a lit­tle unre­al, and you get the feel­ing that they’re all sub­ject to this pow­er struc­ture. And then there’s a crash. It’s not very reli­able machine. There’s a crash and all these peo­ple die, and they final­ly crawl out out of their cubi­cles and they see the sun, Oh my god, the sun. Reality.”

Anyway. This was 110 years ago. And in a way he nailed it. I mean, all of the dystopi­an sci­ence fic­tion lit­er­a­ture since then, whether it’s the feel­ies, or the Matrix, or so many oth­ers, are in a sense echoes of The Machine Stops, so you ought to read this thing. I mean if you’re inter­est­ed in this world.

And in a sense I thought Neuromancer was in the tra­di­tion. It was anoth­er… That The Machine Stops is such a pro­found work it actu­al­ly cre­at­ed its own lit­er­ary genre that con­tin­ues to this day and I thought Neuromancer was some­thing like a cross between William Gibson and E.M. Forster or some­thing like that. But it was also much more than that but that’s a sto­ry for anoth­er time.

Anyway. I used to argue with Bill about Neuromancer. Because he would send me scenes he was writ­ing. He’d been writ­ing short sto­ries in the same style before­hand. And at that time, my thing was vir­tu­al real­i­ty could either end up as the most beau­ti­ful medi­um that brings peo­ple togeth­er and helps bridge the ever-mysterious inter­per­son­al gap. It could be the great­est form for art ever. It could be this thing that makes imag­i­na­tion more val­ued. It could make it pos­si­ble to extend some of the more lumi­nous aspects of child­hood into life with­out los­ing them into the rest of one’s life. There were all these hopes I had for it. I hoped it could be a bea­con of empa­thy that would help peo­ple behave bet­ter.

But, it could also turn out to be the creepiest thing of all time because it could be the ultimate Skinner box. Do you know what I mean by Skinner box? Everybody know that reference? Okay. Because I never know these days what anybody knows.

So it could be like this mind control horrible thing that could be really awful. It could be the creepiest invention ever. And I thought it would flip one way or the other and if we kind of set the right tone at the start, that might help set it on the positive course. So I used to call up Bill and bug him. And I'd say, "Bill, your stuff's so creepy! You have to make it nicer!"

And I'm not really good at doing voices but at that time he still had this incredibly strong Tennessee Southern accent. It was like, "Well Jaron, you know—" I can't do it, but anyway he would say you know, "I'm a writer! I write what I write. I can't change it for this program of yours even though it's very nice."

But when we started to get VR stuff working around oh, I don't know, '84-ish or something like that, he would come by. Which was hard at that time. I don't know if— He was stuck in Canada, because he'd gone there to avoid being drafted into Vietnam. So it was hard— His situation was difficult at that time. But anyway, we'd get together and he would say, "You know, if I had it all to do over I would just work on this tech instead of being a writer."

And I'd say, "Be a writer." But then I said, "Okay, but hey. If you want to come work, you know…"

And he was like, "Uh…maybe I'll be a writer," after he saw the realities of Silicon Valley, where you work all night and you just completely work yourself to the bone on and on and on.

Anyway. So at that time so long ago, the way I thought about it is it could flip either way and what we had to do is just kind of say the right incantations, just get it set on a good course. And where I am today on it— Is this answering your question at all, or have I totally—

Mason: I’m enjoy­ing the sto­ries, so I’m going to let you con­tin­ue. But Jaron, the ques­tion I was going to ask you was with regards to the Internet itself. So cyber­space is what Gibson envi­sioned, but the Internet, do you believe went off course?

Lanier: Well, yeah. I mean, kinda obviously. I mean… Let us count the ways. I mean, under the current Internet as it exists right now, before we even talk about our own situation in the United States, I'll mention that what's called in the trade "shitposting" seems to've played a critical role in fomenting the Rohingya crisis, in destabilizing some parts of India, some other issues in parts of Africa—particularly Sudan. And so there are people dying who do not have the benefit of an advanced society or stable government as a backup. So out in the wild it's absolutely deadly.

I could also talk about the absolutely extreme and untenable wealth and power concentration that's been engendered by the digital networks, that mirrors the late 19th-century Gilded Age's but is probably worse, and is absolutely unsustainable and headed off a cliff.

I could also mention the destabilization of democracies all over the world. Before social networking as we know it showed up, we had thought that once countries went democratic they would tend to stay that way and perfect it. And instead we see the United States, Turkey, Egypt, Austria…I mean, who knows.

I hate to even say these names because I want to be wrong. Like I mean it's horrifying to even think of this. Just speaking of the United States, it's a thing we never expected and it's a terror, and it's…I don't know where it'll end. I don't know how bad it'll get. And it's very clearly related to what's happened with digital networks. So I would describe it as an abject failure. It's…the program of my generation of computer scientists has failed. So…

Mason: Well, John Perry Barlow said in A Declaration of the Independence of Cyberspace that this should not be a construction project. And yet it has become a public construction project. Is it time, Jaron, for a demolition?

Lanier: Mm. Well, my take on it—oh god. This is a tough one. Barlow and I are on opposite sides of interpreting this, and I think it's kind of broken his heart and I don't know what to do about that. We were kinda back at the time, too.

Okay, so here's what I think happened. What I think happened… And this is not to lay blame on anybody, because I don't think anybody really knew for sure how this stuff would work. This was all rather mysterious and experimental. But what I think happened was, back in the 90s and into the turn of the century, there was an extremely strong sensibility that things should be free. That information should be able to move without restrictions related to payment. There should be free music. Email should be free.

For instance, in the 90s there was this controversy about whether some kind of tiny homeopathic amount of postage might be good on mail. And the reason for it was very simply— I mean, there was an immediate reason and there was a long-term reason. The immediate reason was that even the slightest amount of postage would shut down spam, which was already a gigantic problem, right. So if you spam like millions of people, even a tiny amount of postage costs some money, right.

But then the longer-term thing is that if there was some sort of commerce built in from the start, then eventually when the robots start doing more and people take on new roles, at least there's a chance they could get paid instead of becoming wards of the state. So it holds out some kind of hope for that option which wouldn't be there if some kind of payment wasn't generalized, right.

So those were the two reasons. But this idea that things should be free totally swamped that. People were just very very like, sort of militantly dogmatic about it, and many still are. But here's the problem. We then created this kind of super-open Internet where everything can be free, nothing's traced, you can make copies of anything. And I can get back to this whole idea: being able to make open copies of things just means that nothing has context anymore. You don't know where anything came from. The original idea for social networking was to not make copies, because copies are an inefficiency anyway, and you just trace back to the original, both for payment and for context so you would know what things are.

[To audience:] So when was the first design for a digital network? What year?

Audience Member: 1969?

Lanier: 1960. And it was by Ted Nelson at Harvard. Yeah. He didn't call it Xanadu yet, but he designed one in '60 and it was the first one. The packet switch idea predated it but there wasn't an architecture. And in that one, you didn't copy because copying seemed like an unconscionable expense. Instead you traced back to origins. But his reasoning for that was extremely sound, which is then everything has context and somebody can't misrepresent somebody else's context and appropriate it. Plus if you want to have capitalism people can be paid. And if you don't want capitalism you don't have to, but it's an option. So it increases your options. So that was his argument in 1960, and so that was absolutely rejected by this feeling that things must be free.

And this idea that things must be free had merit. I mean, I understand the arguments. There were a series of things that had happened that… Well, I mentioned Bill was avoiding the draft. There were a lot of people that felt that the ability to hide was the most important thing ever in freedom. And so they ignored other things that might also be important. That's the short answer to how it happened.

But anyway. So we created this network design where everything's free, everything's copyable, context is lost. But then we thrust it forward into— And the "we" is specific people, many of whom are very straight. Al Gore essentially did invent it as a political project. Just to be clear, that was actually what happened. You might not even know that controversy, but— [inaudible audience comment] The pipes thing was somebody else's comment. But anyway. That was a Bush comment.

We thrust this open network into a capitalist context, where the larger society was still one where you had to pay for rent and where you were expected to start corporations, and if somebody put money into something they wanted a profit, and all of this. And we made a very raw Internet that didn't do much. It didn't have identity mechanisms, it didn't have payment mechanisms, it didn't have persistence mechanisms, it didn't have…really much of anything. It was just this very raw thing.

So the question is what do you do? So eventually somebody invented the World Wide Web on top of it, and… A British thing. How about that, yet another local thing. Tim Berners-Lee. Well, he did it in Switzerland, so. I don't know.

Mason: Well, we have current issues with Europe, but.

Lanier: No no, I would like to finish. Sorry, no, you don't get another question yet. I need to finish this.

So here's what happened. If you tell people you're going to have this super-open, absolutely non-commercial, money-free thing, but it has to survive in this environment that's based on money, where it has to make money, how does anybody square that circle? How does anybody do anything? And so companies like Google that came along were, in my view, backed into a corner. There was exactly one business plan available to them, which was advertising. And advertising doesn't sound so bad until you remember what all of that cautionary literature pointed out for so many years (I should also mention Norbert Wiener, who's an important figure in cautionary literature), which is that in a cybernetic structure, if you have a computer measuring people and then providing feedback based on that measurement, you are no longer offering them persuasive communication. It's like a rat in a Skinner box. You're modifying their behavior and you're addicting them. You're doing both of those things. And you're doing it inevitably, irrevocably. And so essentially what we said is the only thing you're allowed to do on the Internet is build a behavior modification empire; everything else is disallowed. So it was a project, in a way, of the left that created an authoritarian Internet. And I think it backfired horribly and I think it was disastrous. And that's the thing that needs to be undone.

Mason: So to some degree, Jaron, it was logical that this would be the outcome. Why do you think we're seeing now the individuals who were there at the beginning starting to come out against some of these, as you call them, behavior modification empires? Both Ev Williams, and only a week ago Sean Parker, have tried to step up to be the Cassandra of Silicon Valley and go, "Well, we knew this was going to happen."

Lanier: You know… I need to ask Sean about that. The day before he said this thing about how, "Oh yeah, we intentionally set up this addiction loop," there was a piece, an interview of me by Maureen Dowd in The New York Times about that mechanism, and it's possible that it was tied.

But this has been an open secret. Everybody's kind of known that this was happening. And I think people have to come out against it because the world's being destroyed. It's a matter of survival. I mean, it's really becoming so dangerous.

Mason: So then the question becomes how? How do we divorce these platforms from the nonhuman agency of capitalism, which has morphed them into these problematic entities, these behavioral modification empires, as you call them?

Lanier: Yeah… Well, see, this is the trick, isn't it? I'll make a few observations. One observation is that if we want to live in a capitalist society, and if somebody has a program that they think is better than capitalism, I'm not ideologically opposed to such a thing, it's just not gonna happen in three seconds. So, just for the moment, in the short term we have to survive in a capitalist society. And so we have to think about how there can be businesses that don't rely on behavior modification.

And fortunately there are a lot of them. I mean, for instance, um… Oh god, I don't know where to begin. Of the giant tech companies there's only two that do it. Google and Facebook do it. Facebook more so. Amazon, Apple, Microsoft and many others actually sell goods and services. And you might feel that they should be criticized for various things, and you might very well be right that they should be. But not for that, alright. I mean, they might have little experiments in that direction but they don't really…that's not their main thing.

So it's really two big companies and mostly one big company, and then a few smaller companies that are failing as businesses. I mean Twitter as a business is this really wobbly thing. So it should want to try different business ideas. And it's not hard to imagine what these might look like. If there were like a hundred thousand companies doing it that would be harder, but it really kind of boils down to just a handful that need to be changed.

Could that be done by regulators? Maybe. I think what makes more sense is to try to just kinda cajole the companies, just say, "What you're doing is really stupid. You're decent people of good will. Just do something different."

And the way they transform—like you could imagine a transition— There are a bunch of different paths, but the one that I think is the easiest and makes the most sense is Facebook says, "Hey you know, if you have popular posts we're gonna start paying you. And then gradually we're going to also start charging you, but a very low amount, and if you're poor, nothing. But if you're not poor, at least a little tiny bit." So this is getting back to like digital postage. And we're going to gradually start refusing people who want to pay to manipulate you. And then we have a target that in five years or whatever it is, this thing will become a monetized social network instead of a manipulation-based one. And there'll be no hidden third parties who're paying to affect what you experience. And boom, done. And then Putin has to go cry in a corner.

Mason: So do you think where we are now is just a passing fad? Do you think this will come to pass and we'll look back on these twenty-five years where we ended up with social media and go, "Ha, weren't they so silly to build it that way?"

Lanier: Well that's certainly what I want. I want this period to be remembered like a bizarre bad dream that we passed through. I want this to be remembered like other stupid things that've happened, as just this historical period that was just incredibly strange that we try to teach kids about and wonder if we're really doing it well enough. I want it to be like that.

But I don't know, though. I mean, this could get a lot worse before it gets better. I just— I don't know where it's going. It's not clear how bad things are going to get in the US. But it's bad. It's really bad. It's scary. I mean, I live in Berkeley, California, and periodically right-wing demonstrators come to try to provoke fights. And something's happened which has never happened before, which is once in a while on these days when they're coming and hoping for a fight, there'll be these guys like in pick-ups driving round, and they'll pretend to swerve at you if you look leftist. And then they'll cut back and just go off. But it's a weird, scary thing. And people have started just staying in their homes. It's like a thing that has never happened before. So it's bad. It's really bad. And all these people live in this other reality which was created by— If they're old enough it was created on cable news, but for this population it's all social media.

Mason: Well, let's start talking about other realities. Because the situation we're in, with the fake news and the way in which people's perception is manipulated by these empires, is a form of virtual reality, is it not?

Lanier: No…

Mason: No?

Lanier: No, I mean I— Well, I mean listen, I don't own the term or anything, so you have as much right to define it as anyone. But it certainly isn't… It doesn't correspond to any of the ways I use it. If you want to use the term to refer to an instrumentation strategy or something like that, that's okay. If you want to use it for marketing your product or something, I guess, whatever.

But if you want to talk about it in broader philosophical terms, what I hope we're talking about is a medium of personal expression that might be used by mysterious third parties but so far hasn't been, because it's so nascent. I mean, what I hope is that this period of the darkness of social media will help us sort this out before virtual reality becomes more commonplace. Because it's going to be much more potent than this really crude stuff like you know, Facebook on a phone, which is really not much of anything compared to what'll come, you know.

Hey, by the way, you're all smart, hip people, right? Will you delete your accounts?

Audience Member: Facebook?

Lanier: Yeah. Get rid of it, it's stupid. You don't need it. You think you need it but it's an artificial addiction. Just get rid of the stupid thing. Like, come on.

Mason: Get rid of Facebook, but if you are following this conversation on Twitter—

Lanier: Oh, Twitter too. Get rid of Twitter. Twitter's really stupid. Just stop it.

No, actually— Let me say a couple things about that. What happens on Twitter and Facebook might be quite beautiful and amazing. I'll give you an example. In the US there was a movement that started on social media called Black Lives Matter that brought awareness to a phenomenon— This is another thing, it was kind of an open secret and anybody who knew anything knew it was going on. But somehow it just shifted into something that was an open open secret that people actually talked about. It made it more real. And this was this horrible phenomenon of unarmed black kids suddenly getting killed after a traffic stop, over and over and over again, the police not being prosecuted. And it was like this national blood sport or something, it was this horrible thing.

So there was this movement to raise awareness about it and to try to reform police departments. So, Black Twitter is a form of literature. It's a beautiful thing. It's a legitimately extraordinary literary phenomenon. And it was fun when like, Trump engages with it and they totally run rings around him, black Twitter users. So it's cool.

But here’s the thing, though. At the same time that peo­ple are using Twitter for some­thing like Black Lives Matter, or cur­rent­ly for #MeToo, there’s this oth­er thing going on which is that the algo­rithms, with­out any evil genius direct­ing them, the algo­rithms are putting peo­ple into bins who were doing this. And then what they’re doing is they’re auto­mat­i­cal­ly with­out any evil genius direct­ing them, test­ing what it’s like when oth­er peo­ple react to peo­ple from those bins who are in say, Black Lives Matter or #MeToo or what­ev­er. And then the algo­rithms are nat­u­ral­ly tuned to fur­ther what’s called engage­ment,” which we might more prop­er­ly called addic­tion.

And so if there’s some­thing from Black Lives Matter or #MeToo that then upsets some oth­er group, the algo­rithms will all auto­mat­i­cal­ly opti­mize that to upset them as much as pos­si­ble, and vice ver­sa. So now the peo­ple are auto­mat­i­cal­ly, with­out any evil genius direct­ing them, form­ing them­selves into groups. And then what hap­pens is adver­tis­ers or sabo­teurs or weird bil­lion­aires with a stu­pid agen­da or you know, Russian infor­ma­tion warriors—whatever it is. They come along and they use the Facebook tool and they say, Oh, what can I buy? What can I buy?”

And then this thing is like super opti­mized. It’s a super bar­gain. So like, Yeah!” So they do that, then they push it even more, and it goes more and more. So the inter­est­ing thing is that Black Lives Matter, through its suc­cess, under­mined itself because of this struc­ture. It’s a dirty trick. It’s a dirty dirty trick. At the very moment that you’re suc­ceed­ing, you’re build­ing in your fail­ure. And you will suf­fer in the polls. You will cre­ate a soci­etal reac­tion against you, par­tic­u­lar­ly if you start out in a minor­i­ty posi­tion or in a rel­a­tive­ly unem­pow­ered posi­tion. It’s a fun­da­men­tal­ly impos­si­ble game.

Now, I hope that that’ll be proven some­what wrong in the American elec­tions. That it’s just so out­ra­geous that we’ll be able to gath­er enough steam, but I’m not sure. This mech­a­nism intrin­si­cal­ly undoes and inverts social progress. It’s a social progress invert­er, and it’s built in and there’s no way to fix it with­out chang­ing the busi­ness struc­ture.

Mason: Well, this time around, Jaron, reality is at stake, because it may go into these virtual reality realms. And we look at things like Facebook social VR, and I have to ask: what did you think when you saw Oculus purchased by Facebook in 2014? Did you go, "Great! They're buying these companies. My dream will come true." Or did you go, "Oh god, no. Hell no."

Lanier: Yeah. I had really mixed feelings. I mean, I was kinda happy in a way because it was… I mean, I like to see people enjoy VR. I like to see people feel that it's worth investing in. Oculus in particular came from students of a friend of mine, who created a startup out of his class, Mark Bolas? Are you aware of him? Mark Bolas was then a professor at the University of Southern California. And his student projects included creating the cardboard viewer that Google advertises as Google Cardboard or whatever they call it, and the Oculus Rift. And I thought, you know, this is cool. Like, VR's fashionable and people like money so this'll— I mean, obviously I was enthused about that. I thought then and I still believe that Facebook will change its business plan before it's too late. So I still kinda believe that, you know. I really do.

Mason: Most people have their first virtual reality experience with the Oculus. They lose their VRginity to the Oculus Rift. But you lost your VRginity back in the 1980s. Those were the early days, when you were dealing with these very very clunky devices. What was it like to create those sorts of brand new virtual reality devices?

Lanier: Well, I mean, just to be clear. Like the EyePhone… The better of the two EyePhone models was every bit as good as a current Oculus Rift or Vive or something. They were just super expensive. But yeah, we had to make them. We had to think about like, how do you actually manufacture this thing? How would you… Nobody had ever made anything like that before. Nobody had made the wide-angle optical thing and figured out how to mount it on a head, and how to mount it on different kinds of people's heads, and just the whole business. We had to invent a product category.

Mason: And then you say in the book, the wonderful thing about these big clunky pieces of hardware is it makes the VR more ethical.

Lanier: Yeah, absolutely. So the deal here— So what is the difference between a magician and a charlatan? The difference is that the magician announces the trick, right. And in fact you can even know how a trick works and appreciate a magician even more so, right? So the deceit isn't the core of magic, right, it's the artistry.

And so I think the question is how do you announce the trick in virtual reality? You must announce the trick. And this inclination that many people seem to have to want to make the devices as invisible as possible, or even to sort of just be in VR all the time, strikes me as being both preposterous and missing the point, and also unethical because it then fails to announce.

I call it preposterous because like, to me VR is beautiful. Like, a well-crafted virtual world experience can be extraordinary. And then to say, "Oh, this should just be on all the time," it's like somebody telling me, "Oh, you like classical music? We'll just leave it on all the time." And, "Oh, you like wine? Well, you should drink it all the time, all day long." I mean it's just like, not the way you treat something you love. It just makes no sense at all to me.

Mason: A lot of people fear that virtual reality is going to diminish their experience of the world, but you believe that it’s going to heighten perception.

Lanier: Right. Well, back in the old days…I’m so old. But back like in the 80s, my favorite trick in giving a VR demo was to put like a flower out on the table while somebody was in the VR experience. And then when they’d come out they’d look at this flower— You should try that. There’s this way that it sort of pops into hyperreality because you start— You’ve given your nervous system a chance to adjust to something else, so it re-engages with this physical world freshened. And I think that’s the greatest joy of virtual reality, really. That’s the very best thing there is.

Mason: But the sorts of VR experiences that we’re getting today and the sorts of things that people think of as virtual reality, such as 360 video, that’s not really VR, is it?

Lanier: Well, 360 video, which is another thing by the way we were doing at the time. We had a product called VideoSphere that achieved that back in the 80s. Although analog…tape…it was all very hard to do, but anyway. I think the spherical videos might turn into a genre of their own. Like that might be something that persists, but it’s important to understand what it is and what it isn’t. I’m a little sad that it doesn’t have its own name and that it’s being called virtual reality, because the problem of course is that it’s not interactive and it doesn’t— I think it really doesn’t get to even the beginning of the core of the beauty of virtual reality.

But on the other hand, some of them are very…they can be very good for what they are. And they can also be important. I mean I think as documentary and empathy-generating mechanisms they have been important. There’ve been some very good ones made.

The problem is they could also be very effective at lying. So once again we have to get the underlying economics and power structure right, or whatever good they can do will be nullified. But I’m not willing to just diss them. I think they’re a thing in their own right. I think they’re important. It’s like saying, “Oh, forget that black and white photography, it’s just a passing phase.” I actually think black and white photography is a thing that persists because it has its own beauty, and some of these things have integrity and we shouldn’t just think of them as only having value because they’re on the way to something else.

Mason: I mean, what are the sorts of virtual reality experiences that you are at least hoping for? What sort of worlds do you want to build?

Lanier: Well, for me… First of all let me mention there’s a lot of great work being done now, and I like to take the opportunity to mention younger designers. And so I’ll mention for instance Chris Milk, who did a work called “Life of Us,” which is really a lot… It’s a lot like the sort of thing I used to love the best. Your body morphs through evolutionary phases where you turn into different creatures. And it’s social with other people and it’s got a lot of energy to it. He had to do it in a way that it’s a sort of a timed, sequential experience because it’s intended to be shown with docents in public places. So it’s not the sort of thing where you explore at your own pace. But I think it’s a successful… People have tried it? No, okay. Well, anyway I think it’s a really good one.

As far as things I want to build, there’s… Oh god, I can mention something. I’ll mention some other things as we go through the conversation. But what I notice now is that the small independent designers are in my opinion doing the best work. Although, some of the big studios do good things, too. But a lot of the best qualities of VR come out in the smallest details that can be done by small teams working with very little expense, actually, if they’re careful.

For myself, the goal I’ve always dreamed of the most is some sort of improvisatory system where you’re inside and you play like virtual musical instruments or some other sort of thing. And by doing that you can change the world in any way and invent the world while you’re in it, and co-invent it with other people so it becomes a shared intentional waking-state dream, as one used to say.

And the method of creating tools that are capable of that is still elusive. I’ve tried a lot of different ways to do it, and I still believe it can be done. And I actually have a whole thing in the book about that and the prospects for it. But that’s the thing I’d most love to see.

Mason: You talk about that problem, that the software that’s used to create virtual reality is often on a laptop, a two-dimensional laptop. You can’t jump into VR and create just yet.

Lanier: Yeah, this thing of designing it in some programming language and then jumping in is…ridiculous. I mean that’s completely missing the point.

And I should mention something else. The way a lot of the companies have set up stores where you download an experience and then you’re experiencing it, is also wrong, because it should be closer to like Skype than to Netflix. There should be live interactions, but also there should be a role for live performers in it. There should be a whole new world of people who are sort of like puppeteers, or dungeon masters or whatever you might call them, who are improvisatory performers within virtual worlds, where that’s actually the main point. Because that’s just much more appropriate to the medium than thinking of it as a download. And I think that there’s been a real category error in the nascent virtual reality industry on that point.

Mason: Well why do you think we’ve run down this path of trying to re-present reality as it is and turn it into virtual worlds? Why do all these virtual worlds look so familiar? Why aren’t we creating othered experiences?

Lanier: Well, I mean some— There are— As I say, I mean I think there are really good designers, so I should focus on them rather than the crap, of which there’s a lot. So I’ll mention Vi Hart. She does math-explication virtual reality experiences, but they’re extraordinary for learning how to walk around in four-dimensional spaces or something. And she has a wonderful group of people who build these things and they’re just fantastic.

So I’ll mention another one. The thing is, let’s focus on the good stuff. I mean, why are there so many bad movies? I mean my god. I don’t think there’s any like, explanation needed for why there’s so much crap in any given medium. I think that’s kind of—

Mason: But what I worry—

Lanier: Or maybe there is, but it’s an old mystery, not a new one.

Mason: But where I worry is people are coming to virtual reality and having these very kind of vapid experiences where the agency is—especially in a 360 video—the agency is that of the director. They don’t get to have this kind of elating, individualist—

Lanier: I know. It’s a big problem, you know. I’m really kind of bummed about that. I’m really kind of bummed that a lot of people think they’ve experienced virtual reality, and what they’ve actually experienced is something that was pretty shoddy. And that’s a drag.

But you know what? I mean… I just think that that’s what it’s like when a new medium comes along. I mean I think when cinema started there was a lot of stupid stuff that people saw first, you know. I mean that’s honestly true. We like to remember the high points but, there was actually a lot of crap.

For one of my earlier books I started looking at what books other than the Bible were printed when the printing press became available, and it was not all great. There was a lot of really stupid stuff. So you know, I don’t know what motivates people to put all this work into making something that’s really shoddy and stupid. But that’s true for like— You know, you look at a movie and like why—they had all this money. And then they threw it into this thing that you could tell that everybody who’s making it knows it’s a piece of crap. Like why? Why not stop and say, “Hey, we’re making a piece of crap. Let’s spend this money on making a better thing.” Like, why don’t they just stop for like— I don’t know why. I mean it’s like one of the great enduring mysteries. I just— I don’t get it but that’s what happens.

Mason: Well there’s truly great virtual reality. And not great in terms of the content but great in terms of what it’s able to do to the human individual. So we’ve already seen that virtual reality can be effective at helping cure PTSD. It can be as effective as morphine at treating pain. You can create empathy—

Lanier: Can I comment on both of those for a second?

Mason: Please, yeah.

Lanier: Yeah. Those are interesting to me in different ways. On the PTSD treatment, this is a case where I was initially really cynical about it. When people started doing research in that I thought, “Oh, this is just too cute.” This sounds like somebody just wanting headlines. It’s like, it’ll be really catchy. “Oh yeah, we’ll use VR to treat these things.” But the clinical results came in and they were replicated, and it turned out to be a real thing. So that was a case where I was a little too cynical.

And then the other one you mentioned, on pain relief. One of my students who’s now—she’s just become a professor in the medical school at Cornell in New York—Andrea Won, came up with an amazing thing. Can I tell you about it?

Mason: Please. You can’t set it up like that and not tell us.

Lanier: Well I don’t know. So, everybody here will know the difference between what we can now call classical occlusive virtual reality and mixed reality—also known as augmented reality. So what she did is she took a population of people who had chronic but localized pain, and put them in social settings with mixed reality headsets. Meaning that they still saw the physical world, but with extra stuff. And then they would paint on their bodies where the pain was. And it could look like a virtual tattoo or a Band-Aid or something. And then it actually stuck with them. And to get that to happen is a whole tech problem. It’s not gonna happen with an off-the-shelf HoloLens. But anyway, it happened in the laboratory setting.

And so they had these artificial tattoos that were persistent. And then gradually, over weeks and months, they started to dissipate, and we didn’t tell them it was going to happen. And then a statistically significant number reported their pain dissipating. Isn’t that cool? Yeah.

And then just as a reminder, every time somebody identifies a technique of that kind, precisely the opposite horrible, sadistic version of it is also hypothetically available. So it all comes down to incentives, power structures, ethics, society. Like, there’s no such thing as a computer that’s going to be ethical on your behalf, or kind on your behalf. That’s up to you—it’s up to all of us. So this is not like some panacea story. It’s actually a challenging story if you understand the whole dynamic.

Mason: I was going to ask, if we’ve proven that virtual reality has these wonderful egalitarian uses and changes the mind and changes the body, then can it inversely be used not just to cure trauma but to invoke trauma—

Lanier: Hell yes.

Mason: Could virtual reality be the ultimate mind control device?

Lanier: Yeah. Virtual reality has the potential to be the creepiest invention. That’s absolutely correct. This has also been studied. So I have a co-researcher and friend named Jeremy Bailenson at Stanford. And we’ve been studying how changing avatars changes people. So here are some things that have been not only published in peer-reviewed journals but replicated in multiple places.

One is you can make somebody more racist without them realizing you did it to them. You can make somebody less confident in a negotiation without them realizing you did it to them. You can make somebody more likely to buy something they don’t need without them realizing you did it to them. And the list goes on.

Mason: So this sounds like Facebook again.

Lanier: Well see, that’s the thing. That’s what must not be allowed to happen. So Facebook simply has to change its business plan before Oculus gets any good.

Mason: So what’s the solution for that? Is it government regulation of VR and…?

Lanier: Well, I mean I think the solution is just to talk like this until they’re shamed into doing it. I know that sounds crazy. But I think that’s— No, look at—you know, Sean Parker is with the program. And I’m working on Peter Thiel, and eventually, like, this thing is going to crumble. Because it just makes sense. It’s just like, what’s happening is too stupid for people of good will who are not stupid to endure. Like, they have to just change it.

Mason: I want to talk— Yeah, right. And the fact that there was silence is slightly disturbing, especially from this audience. I was expecting everybody to leave and riot now.

Look, this other thing about VR… Do you think… The folks who see how effective it is, they go back and say, “You know, it’s just like…this thing, like psychedelics.” You spent a lot of time, I know, with Timothy Leary, and they were talking about cyberdelics. And that term is coming back into fashion recently because it kind of proves the efficacy of VR. People say, “Oh, it’s like a cyberdelic experience.” And what you actually get is like, 60s visuals with 90s rave music over the top. [Lanier laughs] Well it’s…it’s terrible.

Lanier: That sounds like Burning Man.

Mason: Do you think Timothy Leary would be disgusted by the sorts of VR experiences today? Would he go, “This is not what I meant when I said cyberdelics! This isn’t a psychedelic experience, this is someone’s bad fucking trip.”

Lanier: Well, I have some funny stories about Tim in the book. I knew Tim pretty well. I used to— We had a lot of different— We disagreed a lot. I mean, in a way I had a dialogue with him that was a little like the one with Bill Gibson, where I… I kind of felt at the end of the day that the way he’d presented psychedelics maybe had backfired, and maybe wasn’t so great. Because he was just super utopian about it. And…

There’s a funny story about how I met Tim. So, I’d been complaining to Tim about— I’d been complaining, but we’d been communicating indirectly, mostly through rants in like underground zines, which is what people used to do before there was an Internet. And there’d be like this zine in the back of a bookstore like this, and it’d be terribly printed and it would have like some poetry and weird art and stuff, and then…anyway. So we’d been complaining to each other. And so finally he said, “Okay, let’s meet.”

I said, “Great!”

And he said, “Well, I have to teach this course in something or other at the Esalen Institute.” Do you all know what that is? It was this super influential kind of very utopian—it’s still there—sort of New Age institute that’s located in the world’s most beautiful location, on these cliffs above the ocean with natural hot springs. And it’s where a lot of cultural trends started that are associated with 60s or alternative culture. And it’s been around for quite a while. But just a lot of little things, like workshops you might go to, or yoga, or food you might eat. A lot of stuff started there. So it’s a really big formative place, culturally.

Anyway. So I said, “Great. I can drive down.” He was coming up from LA. “I’ll drive down and meet you at Esalen, and we can meet.”

And he said, “Well, actually… I really don’t want to teach this workshop. So what I’ve done is I’ve hired this Timothy Leary impersonator [audience laughter] and I’m going to smuggle him in. He’s going to—” The guy who used to run it—or still runs it—is Michael Murphy. He’s a friend of mine still. “And as soon as he’s gone, I’m going to just have the impersonator do the rest of my workshop. And what I want you to do is smuggle me out in the trunk of your car past their guard gate so that I can get out of doing this thing. And then smuggle me back right at the end so I can get paid.”

And I was like, “Suuure,” you know? Like, fine. Like, the stuff we do for money in Silicon Valley is a lot less dignified than that… I might as well go for it. So.

I had this really super beat-up jalopy that I’d had forever from New Mexico that… Oh, this is a whole— These stories go on and on. But this thing, it was really messed up. It didn’t have back seats. It had hay in the back because I used it to move goats around and stuff. But the trunk was completely filled with early computers, so I had to get together with a friend of mine at Stanford and we were like, dumping these computers that’d be worth a ton today. Like all these really—just to create enough space for him to sit there. And we were like, well, would Tim fit in this hole we’d created…

And so I went down there and sure, he fit. Although these computers fell on him and he was like, “Oh! Oh!” I think an old Apple Lisa fell on his head is what happened, if you know what that is.

Anyway. So I’m like all tense, and like, I’ve never smuggled something past a guard gate before. And I’m like, oh, so like I’m gonna be in jail. This is going to be horrible. I’m going to be a party to fraud and some felony and my future’s gone… So I’m driving up and there’s this guard gate and we’re driving up to it. “Hello, hello.”

And then there’s this totally stoned hippie guy who like can’t even look up, just like, “Uhhhh…”

But like, [makes zooming noise] Anyway, that’s how I met Tim.

Mason: So from that meeting you had, you know, a feeling for how he felt about VR.

Lanier: Oh oh oh. That’s right, you asked a serious question. You’re confusing me.

Mason: I’m about to give up and hand it to our audience in a second, but—

Lanier: That might be wise. I mean, it’s hard to really know what Tim would have said. I know… Tim was a good guy. I mean, he wanted the best for everybody. He um… I mean, I really don’t know what— I think what he might say now is that boy, all those just, you know… I mean, the really interesting question is like, what would somebody like William Burroughs say. Because we’ve entered into a period of such darkness. It’s… I don’t know. I mean I think he’d be heartbroken the way we all are.

Mason: But there is also a real opportunity with VR to imagine some of these alternative business models that you’re talking about. So the thing with the virtual reality platform is that it’s reliant on capturing a degree of human data. And we may get to the stage where we’re creating these experiences through capturing neuro data and bio data. Could virtual reality be the platform that reimburses the individual in exchange for that sort of data?

Lanier: I’d like it. I mean…so here’s the thing. If data comes from you, and you’re in control of it, and it has an influence on what you experience, that’s one thing. If that is meshed with influence from some unseen paid person who gets in the middle of the loop and influences what comes back to you as a result of your own data, that is manipulation, and that’s the end of humanity. We cannot survive that.

So one of the things that’s fortunate is that it’s very easy to define where the problem is, because it’s an information flow. There must not be an information influence from unknown third parties on how the feedback loop works—that must be up to you. So that’s the first thing to say.

And then there’s a very nice fallout from that, which is that’s also a path to enduring human dignity as the technology gets better and better. And I’d just like to explain maybe a bit about that before we take questions? Is that okay?

Mason: Please, yeah.

Lanier: Right. So what’s been happening now is we steal data from people, and then we use the data to feed artificial intelligence algorithms. But because we call them artificial intelligence algorithms instead of just algorithms, we have to create this fantasy that these things are alive, that they’re free-standing. And so then we give this message to the people we just stole from, that they’re no longer needed and their only hope for survival as the algorithms get better is that they’ll become wards of the state on basic income or something. And it’s a very cruel, stupid, demeaning lie.

The clearest example I’ve found for introducing this idea to people who haven’t heard it before is with language translation. So the people who translate between languages, like between English and German let’s say, they’ve seen a drastic reduction in their life options. It’s similar to what’s happened to photographers, to recording musicians, to investigative journalists, to many others. And the reason why is that the stupid little translations, like translating memos, can now be done automatically. And by the way I love these automatic services. They’re not great. I don’t want to see my book translated by Bing Translate. But on the other hand I’ll use it for a memo or something.

But here’s the thing. In order for these automatic systems to work we have to steal tens of millions of example phrases, from all over the world, in all the languages, from people who don’t know they’re being stolen from, who are never acknowledged and never paid. And we’re telling those people that they deserve to not be employed anymore because they’re no longer needed, because they’re being replaced by automation, when in fact they’re still needed and they’re still the basis of everything, and it’s not automation, it’s just a new, better channeling of what they do.

So the dishonesty in that is cruel and horrible, but it’s also creating an absolutely unnecessary pessimism about the future, that people won’t be needed when in fact they will be. It’s one of the things that gets me really angry. Because there’s this dogma: “Oh, it’s AI, we’re creating AI.” That’s been at the root of my hostility towards like artificial creatures and stuff, because it ultimately does destroy not just human dignity but the basic mechanisms of humans being able to make their way.

Luke Robert Mason: And on that note, we're going to hand this back to some humans. And I'm going to do two things. I'm going to hand you a different mic, Jaron, so you're a little more comfortable.

Jaron Lanier: No no no, I'm good. I'm good. Is this—can you hear me? Yeah yeah, okay. Everything's working.

Mason: Alright. The one next to you's a little louder. I might transfer you over. So are there any questions? Eva.

Eva Pascoe: Hi. Thank you very much for fantastic stories. We had the pleasure of hosting you at Cybersalon a good few years ago when one of the early Aibo dogs had just been released. And I remember you took a strong dislike to the Aibo dog.

Lanier: Mm hm.

Pascoe: What's your view of Sony bringing it back, as they just announced?

Lanier: Well, um…I think they should be smashed. They should all be destroyed. I mean— No, this is really a bad thing. I… If you believe in a robot dog, in the same breath you're believing that you're worthless in the future—that you'll be replaced. It's a form of suicide for you, for the reasons I just explained. It's economic suicide and ultimately it's spiritual suicide. And I feel that strongly about it.

Now, that said there could be special cases. So if there's somebody that for whatever reason is made happy by this device and it helps them because of their very special issues or problems, of course I'm not going to start judging people on an individual basis.

But overall it's a really bad idea. It's a bad idea in the sense that… I don't know, it's just like— It's truly a…it's an anti-human idea. I expressed a theory the other day in a piece in The Times that the reason cat videos are so popular online is that cats are a special creature in that they weren't domesticated by us like dogs. They're not obedient. They're independent. They came along and demanded that they live with us, but they've kept their independence. And there's this kind of self-directedness of cats which we love, and we see it in the videos. And it's exactly the thing we're losing in ourselves. So when we see cat videos online, we're looking at the part of ourselves that's disappearing. That's the longing that's expressed. And that's precisely why you shouldn't have a robot cat.

Audience 2: Jaron, could you talk—

Lanier: Oh is that Kathlyn? [sp?]

Audience 2: Yes.

Lanier: Oh hey!

Audience 2: Hello! The musical instruments, and the user interface, and the future of art in a positive evolution of technology.

Mason: Wonderful question.

Lanier: Well, I have an addiction problem with musical instruments. I have… I don't know the count but it's well over a thousand. And I play them. And I'm just a nutcase about it. And I'm not bragging, I actually am confessing. Because it's really a sickness and it reduces the amount of air there is for my daughter and wife to breathe in our home. And it prevents us from ever being able to walk from one point to another in our home in a straight line.

But I love them for a bunch of reasons. One of them is they're the best user interfaces ever designed. They're the most eloquent, highest-acuity things you can really become virtuosic on. Not just at performing a particular task but at improvising in a broad palette. They're by far the most expressive and open things that have been designed. That is to say, compared to like a really good surgical instrument, which is really designed for a particular thing, a saxophone is designed not for a particular thing. In fact it's mostly played in a way that has nothing to do with what it was designed for.

So they have this incredible open potential inside them, and people can spend their lives getting better and better and deeper and deeper with them. And that's what I want most for computers. And I could go on and on about many other qualities of them, but what I hope is that eventually computing will be like musical instruments. It'll be this thing that people can get deeper and deeper with, that is an open window into a kind of an infinite adventure that is so seductive that it doesn't have an end. That people keep on growing and growing.

This is a really important point to me. One of the old debates in virtual reality is whether you could ever make a virtual reality system that's so good that you're done. That it fools you and that's it. And I always said well no, you couldn't. And the reason why is that people are not fixed. That as the VR systems get better, people learn to perceive better. And so it's not so much that the people are just like these fixed objects, then the VR gets better and then you're done. No, the people change in response to the VR.

And that process of people growing through their perception of technology is… Well, I used to frame it in almost apocalyptic terms. What I used to say is our job as technologists is to kickstart adventures that are so seductive that they'll seduce us away from mass suicide. That's the way I used to talk in my twenties. But I still think something like that's approximately true. That if our motivation is just more and more power, or more and more efficiency, or more and more optimization, we'll somehow just destroy ourselves. But if our adventure's one of ever greater beauty and connection and going deeper and deeper, then actually that's survivable. That can go on forever. So that's the one that has to win.

Mason: Question just here, Vinay Gupta.

Vinay Gupta: Hi, this is me calling in from the gods. So, I'm one of the blockchain folks. Although in the 90s I spent my time as a graphics programmer waiting for VR to arrive. This is your fault. What do you think of the blockchain, both the hysteria that surrounds it and also maybe the potential for long-term mature technology? How does that intersect with things like the story that payments will remove the misaligned incentives that produce surveillance capitalism?

Lanier: So. Blockchain, yeah, that's been the question lately, hasn't it? There are a few things to say. The first thing, I'd like to draw a very sharp distinction between blockchain and cryptocurrencies. So let me deal first with one and then the other.

So with blockchain… This sort of thing could be very important. Let's remember though that the way blockchain works is by explicitly being as inefficient as possible, alright. And so if that really scaled up to run everything, it would be an unconscionable crime against the climate. So the only way to use blockchain as we now understand it, at the scale of computation that would be needed to scale it, would be to find some survivable way. And in all seriousness maybe the blockchain servers have to be on the moon. Or somewhere, you know. And that might be a good thing to do with the moon. And I'm not being flippant. I'm actually completely serious.

But the other thing is if you imagine some kind of nanopayment system on things like Facebook at a very vast scale, it's possible that the blockchain idea is like…emphasizes rigor too much. And we need something that's a bit more…statistical and much much much much much more efficient. So that's one thing.

The other thing is that the whole blockchain thing, like many other structures, rests on what might be thin ice, because we don't have a formal proof that some of the encryption schemes are really going to withstand… I don't know. [inaudible audience comment (from Gupta?)] Yeah, I'm a little nervous about that. But that gets to be kind of a geeky topic, alright.

But now let's move to cryptocurrencies. So here's my question to you. Why is every fucking one of the cryptocurrencies that's been launched to my knowledge a Ponzi scheme? Alright.

Gupta: But all currencies are Ponzi schemes.

Lanier: No, that's not accurate. All currencies are vulnerable to being Ponzi schemes but they aren't necessarily. And I'd refer you to Keynes to understand that. If a currency can be well regulated by well-intentioned people it doesn't have to be a Ponzi scheme. It can grow and change with a society.

But something like Bitcoin was precisely designed to be a Ponzi scheme. I mean it unfairly and vastly benefits the early arrivals. And then everybody who does well on it later, they have a lowered statistical chance of doing well. But anytime they do well they make the founders do even better. So it is totally a Ponzi scheme, but since it's so global it just takes a really long time to break compared to like, a Bernie Madoff scheme or something, which is tiny in comparison. So, they're horrible. And even the ones since Bitcoin that are sort of more enlightened in various ways are still Ponzi schemes. And there's no like, difficult mathematical mystery to how to make one that isn't a Ponzi scheme—but they're all damn Ponzi schemes.

So, the thing is if we can't get over the grandiosity and greed of people in that community, it's not worth a thing. So the question is when do we start to make these things that are actually for real?

Mason: That's what scares me about that community. They want to build Web 3.0, but they're borrowing the language of Web 2.0 to make individuals believe it's going to be an easy transition through. And I am deeply, deeply concerned for friends and colleagues of mine who are investing in the currencies and not actually realizing that there's real potential and opportunity in that decentralization, but it all gets thrown up into the memetic power of day trading currencies. It's such a loss.

Lanier: And we might end up with a nasty surprise whenever the identity is revealed of the Bitcoin founder. I mean it could turn out to be Putin or something, so. I don't know, I mean I think it's perilous.

Mason: Any other questions at all?

Trudy Barber: Yes, it's me, Trudy!

Mason: Oh, it's Trudy Barber, the UK's leading cybersexpert. [crosstalk] Make this a good one.

Barber: Sorry! In the early nineties as an undergraduate at Central Saint Martins, I created one of the earliest virtual sex environments, making a complete three-dimensional digital space, with floating condoms and all sorts of things in it. What I'm quite interested in is as we're seeing ideas of you know, the 360 porn industry—which I don't actually see as some kind of virtual sex experience, I see it as a bit of a gimmick. I'm quite interested in how we are perceiving our sexual identities, or the potential of perceiving our sexual identities, within a proper 3D virtual space where you have interaction, where you have immersion, and how we perceive kind of…that whole idea of the phantom phallus, or the phantom vagina. And the way that we perceive our gendered experience of the world, and how we could really experience and experiment with that in virtual spaces. So what's your opinion on virtual sex in that context?

Lanier: Yeah. Well, alright so, this is another huge topic that could take a whole week or something so I can only barely prick the surface of.

Barber: Ooh!

Mason: Anything you say now, Jaron, this audience is a British audience. They're going to read innuendo everywhere.

Lanier: I was told that Brits were very delicate and unable to…

Mason: You haven't met this audience.

Lanier: Okay. So, I'll say a few things. Let me start with this sort of… Just the— God, how do I even begin here? So what I— Just in the— Oh God. Okay, look. What I think has happened is our ideas about sexuality, going back to the late 1800s, with the very earliest moving images, with zoetropes and all, is we've had this very strange kind of artifact orientation around sexuality that turned into porn. Which is… I think it's hard to remember that things weren't always that way, which is clear from historical reading.

Now, there's been literary porn going back to ancient times and all sorts of stuff, and it's very interesting to read some of that. But there's this very particular reflection of cinema, which I think has narrowed people. I think it's created a set of clichés that people grow up with that is unfortunately small, you know. It's unfortunately limited. And I think in the future we might look back on the cinematic period as one that was profoundly anti-erotic. Because we were stuck in this feedback loop with the things that were easy to film that we would see when we're young, and that sort of thing.

Now, I'm not sure that's true but I suspect that's true. And I should also point out that in pre-cinematic literature, a lot of things about gender and sexual identity are in fact more fluid. In ancient literature and even fairly recent literature. I think that cinema had a sort of a cementing effect on us psychically, in a lot of ways. So that might be— I hope that doesn't sound offensive to any filmmakers in the audience, but.

So when I was young, like in my twenties, I was really fascinated by erotic ideas in virtual reality. But I always thought that representation of like, body parts or something would be the least of it. And I was really interested in like, joining bodies with somebody interactively, trying to control a shared body and learning how to move together. Which is… You can do a little bit dancing and a little bit different ways, but not like this. This is something else, and it's something really extraordinary that very few people experience these days because I just think for some reason they're wasting time on stupider stuff.

Barber: Where would you see haptics engaging in this?

Lanier: Yeah, that's an interesting question. You know, I was talking about the cinematic era having an effect on sexuality, and I suspect that that's been somewhat more pronounced for male sexuality than for female sexuality. And there's a whole long discussion one could have about why that might be so if it's so and all that. And I think as far as the intersection of technology and sexuality, for women it's already been more haptic. And I…

So one might predict that that's likely to continue and… In general the haptic modality's the one that's the crudest and needs the most work in virtual reality. And it always— Like I've seen this again and again and again and again. Every time the stars line up and there's a bunch of funding for virtual reality work, whether it's in a corporation or university or something, the lion's share goes to vision. And then the next biggest portion goes to sound. And then poor haptics, which is the hardest one that really should be at the front of the list, gets kind of like the crumbs. And it's frustrating. It's exactly backwards, [crosstalk] but that's just a repeated problem.

Barber: There is the datafication of haptics starting to happen now, with the recording of sexual responses. And I think maybe that might be a different way forwards, combined with the virtual immersion, where the datafication of pleasure might actually expose different ways that we actually engage with our bodies. And also how it becomes like a commodity that is sold on. Which fits in with some of the other arguments that you make.

Lanier: Well, perhaps so. I mean, the concept of pleasure is one of the— Things like pleasure, and happiness is another one… I wonder if in the future we'll understand these words to be…much broader and more process-oriented than we do now. Because I have this feeling that people tend to think of pleasure as a sort of…a destination, and happiness as a destination, almost like a formula that's been fulfilled or an equation that's been solved. And I'm sure that that's the wrong way to think. And so what I hope is that it will expand into much more of an ongoing, infinite process.

One of the books I used to quote all the time in the 80s, and I think I still mention it in the new book—I can't remember anymore—is called Finite and Infinite Games by James P. Carse. He proposed that just as a— In a first broad-brush way of understanding reality you can divide processes into finite or infinite games, with a finite game being something like a particular…well, a game of football, that has to come to an end. Whereas the overall field of football is infinite and need not come to an end. And so you have to understand which things are finite and which things are infinite. And right now the way we're approaching technology on many levels is finite. It has to come to an end, which would mean our end. But if we can coax it into its infinite cousin, then we have a means of survival. And that's true on every level—economically, aesthetically, and on every level.

Barber: Thank you.

Lanier: Sure, thanks for the question.

Mason: Any other questions at all?

Audience 5: Hi. I love the way your opening gambit was to ask us if we'd read a particular book. Which I had, by the way. And you were a bit disappointed that most of us hadn't. And throughout your talk, actually, you talked about books and reading a lot, which might come as a surprise to people who thought it was going to be much more visually oriented—

Lanier: Well I am hoping you'll buy a book. I mean, that's why I'm— Like I mean. You're witnessing abject corruption here. And you know, don't pretend it's like some elevated state of consciousness. This is raw American salesmanship. [inaudible comment from A5] Yeah okay, go ahead.

Audience 5: So what I wanted to ask you, if it's not too personal, is what are you reading at the moment?

Lanier: Um. Yeah no, that's a great question. I've been just reading the Lapham's Quarterly special edition on music, which has all sorts of really interesting ethnomusicology scholarship that I'd never run across before. All sorts of obscure things, and those are really, really fun. And I've been reading… Well, Masha Gessen's book on the decline of Russia, which is a very, very sad and terrifying thing to read. And you know, impressive and wonderful but not easy. And I've been reading… Let's see. I've been reading Roger Penrose's book on fads in physics. And I've been reading um… God, there's so much stuff.

I mean, I love reading. I just—I adore reading. And the thing about, I mean… I sometimes wonder what—you know. We have a little bit in ancient literature about people who were skeptical of reading when it was still somewhat fresh and novel. That you know, we've been warned that it'll ruin memory. That it'll make people weak. And you know what, I think that's all true. I think that there's a dark side to reading. But one of the things I love about it is people sorted it out. Like, there's been Mein Kampf, there's been horrible books, books have been used to manipulate people. And yet we've sorted it out, where overall books are this wonderful thing. So as with musical instruments they're another inspiration for what must happen with computation.

Mason: We probably have time for one or two more questions.

Audience 6: Hello.

Lanier: Hey!

Audience 6: It's great to be here. I want to draw on something that you briefly mentioned, and that's the idea of empathy, which is becoming particularly prevalent in the marketing of VR as an empathy machine. And I kind of wondered where you think we should draw the line with that. For example, Mark Zuckerberg saying that when he went to Puerto Rico via VR he felt like he was actually there. And this kind of thing is being perpetuated in order to market VR as a consumer product. So yeah, I was wondering what your thoughts are on the rhetoric of the empathy machine.

Lanier: Well…sadly what happened in this case is he was, uh— They did this sort of pretty basic cartoonish thing where they had a 360 video of being in a flooded and destroyed part of Puerto Rico, with a couple of Facebook executives in as avatars, talking about "how magical and amazing it was to be there and how it felt like they were really there!" And then afterwards he apologized, saying, "Oh, I forgot to mention that this was supposed to be about empathy for the people there." Like he just totally spaced that part out at first and then had to correct himself later.

So yeah, the whole empathy in VR thing used to be my spiel. I kind of started that language. And I mean it can go either way. The truth is there's nothing automatically empathetic about VR technology. I mean like, what I'm concerned about with the rhetoric of selling it as empathy right now… I mean they stole my shit, that's fine. But the thing that really bothers me is that it's suggesting that the technology itself is going to provide empathy, which is…ridiculous. I mean, for god sakes it's like a gadget you're buying from some company. It's not going to be empathic for you. Like that whole notion that empathy comes from the technology is wrong. If technology can help a little bit at the margins to help people express empathy or find empathy, that's great, but it's a human thing. And so that miscommunication is just stupid. And this Facebook thing in Puerto Rico was like a great example of how it can go wrong.

There are a few advantages and a few problems to VR as an empathy device. The advantages are that it can convey aspects of human experience that can't be conveyed by other media. It can convey what it's physically like to walk in somebody else's shoes. For instance there've been virtual worlds to convey what it's like to have different physical disabilities or whatever, and I think that that's really interesting work.

The bad side of it is that if you capture the world suitably to re-create something about it in virtual reality, you've captured enough of it that it's easier to fake than if it was just a photograph. And so this is a bit of a subtle technical problem. But if you take a photograph and you want to fake it, if you Photoshop it, there are traces that are very hard to fully cover. So you can write programs to find the little weird patterns that are left over from that operation. So there's at least a little bit of a film of protection if you can get at the original digital file. Of course if somebody's just done a low-res version of it or something, then at a certain point you're just stuck.

But if you've really captured a virtual world… Like, let me give you an example. Let's say you take a photo of something and you want to change what appeared to happen so that you're trying to generate an outrage machine based on shitposting fake news instead of something real. Which is the more typical thing these days. That's more common than an attempt to tell the truth. Lying has been amplified like a million times while the truth has been only amplified by like three times or something. It's like a complete inequity.

And so let's say you're trying to lie by changing a photograph. Well, you have to look where all the shadows lie. You have to make it consistent. And there's not going to be any algorithm that does that completely perfectly. You have to sit there and look at it.

But if you've captured it volumetrically, then all you have to do is move the sun. I mean, you all of a sudden have turned it into a simulation, and a simulation is parameterized and so it's easier to change. So suddenly you can lie with it better.

So, the empathy thing is real. I spent many hours promoting the idea of virtual reality as an empathy machine in the 80s. It's legit in a sense, but it's all about you. The machine can't do a damn thing for you.

And I… I don't know. This idea— I don't— Techies just don't seem to be able to…take a human-centric approach sometimes. They just really have to think of the tech as being where the action is. And it never never never is. It isn't even real without us.

Mason: So, Jaron, I'm going to ask you one final question. Because the Virtual Futures audience spans about twenty-five years. It's an intergenerational audience. How can we better work together to change some of the issues that you raised, and together create a more desirable future?

Lanier: Well—

Mason: Or are we just…fucked? We just…there's no hope.

Lanier: You know, the speaking truth to power thing is still legit. Like, just… I mean I think in a way just sort of pointing it out really helps, you know. I'd say either try to stop using social media, or use it very carefully and differently than you have, or something. All the people trying to change the world with social media, as I explained earlier, are completely undoing themselves at every turn, and you have to learn to overcome that addictive illusion. And I think you have to turn to other institutions that are still standing.

And I'm worried about us saving our governments. I'm just afraid that it's become so distorted that it might—that we're all falling apart here. So I think just trying to save governments is a good project right now. We used to worry about them being too powerful and now I feel like they're so fragile they could just vanish.

Mason: Well, is it a case of what expands, contracts? I mean, the technocracy is the replacement of the government, and we're hoping that the government now is going to be the thing that fixes the technocracy, and we're just going to continue to see this expansion and contraction between government, technocracy, government, technocracy.

Lanier: Yeah, I don't know. I mean it's a funny thing. If we think of the tech companies as the replacement for government… Which a lot of people are doing. I mean it's true to some degree already. I mean, the good news is that the people who run the tech companies are still kind of relatively young and they're almost all really nice. I mean…they're good people, you know. I mean like, in a choice between Zuck and Trump I mean obviously Zuck, right? I mean there's not even a…

But the problem is that the amount of wealth and power that is concentrated in the tech companies is so great that we can't just think about the current batch of cute kids there, you know. We have to think about who'll inherit the power. And when power is concentrated it tends to be inherited by less attractive figures. So, Bolsheviks were cuter than Stalinists, let's say, right. And so with Facebook, it's the first large public corporation controlled by a single person, and that has to be inherited somehow. There has to be some kind of way that turns to something else. And there are all the scheming, horrible people in the world, you know, and who owns it like in a hundred years if it continues to be what it is?

So I think that the tech companies, as government, are the least representative governments ever, you know. I mean even less so than a royal—like a… I mean, I think even like someplace with a royal family like Saudi or something still has to be a little conscious of whether the people are going to totally revolt. Whereas something like Facebook has a kind of degree of impunity because everybody's addicted. It's a completely different relationship that favors them in a way that citizenship doesn't necessarily favor a state.

Mason: Well, on that note I hope there's at least a little hope. And hopefully—

Lanier: I'm full of hope. I just uh…I'm a realist, though. I mean I think you can be a hopeful optimistic realist, and it's just…it's just work, that's all.

Mason: So, hopefully this audience will keep doing exactly what you said. Keep having these issues out in public. And on that note I want to have a couple of thank yous. Firstly to the Library Club for hosting us. They're a wonderful venue, and they've been very kind in having us here.

And a massive thank you to Cybersalon and to Eva Pascoe. This was a free invite-only event, and it's because of Cybersalon that we're able to cover our film costs. And everything that's produced is going to be released under Creative Commons, so you can remix it however you like.

Lanier: Part of what's destroying the world as I pointed out.

Mason: Really? Alright, I'll stick it behind a paywall and you can all pay us more.

Lanier: No no no no. You'd have to pay them because they created it.

Mason: Alright, I'll pay you to watch the video. [audience laughter] Ugh. So look, we're entirely audience-funded, and I don't know what you're going to say about Patreon but if you like what we do [Lanier laughs] please support us. I don't know, just…buy me a beer?

And you can find out more about Virtual Futures at "virtualfutures" pretty much anywhere online. And of course a massive thank you to Penguin Books who're on the balcony. And Jaron's signed a bunch of books that're available for sale. So please don't all rush up there because they're gonna get hounded. But the books are available and for sale at the back. And thank you to the Penguin team for making this possible.

And I want to end with this, which is how we end every single Virtual Futures. And it's with a warning. Because the future is always virtual, and some things that may seem imminent or inevitable never actually happen. Fortunately, our ability to survive the future is not contingent on our capacity for prediction. Although, on those much rarer occasions, something remarkable does come of staring the future deep in the eyes and challenging everything that it seems to promise. I hope you feel you've done that today, and assuming that this isn't an impersonator, please put your hands together and join me in thanking the incredible Jaron Lanier.
