Cory Doctorow: Thank you very much. Hi. So, there’s a little formality first. As a member in good standing of the Order of After-Dinner and Conference Speakers of England and Wales, I am required as the last speaker before lunch to make a joke about being the last speaker before lunch. This is that joke. Thank you.

So, I work for the Electronic Frontier Foundation. And I’ve met some of you in the halls here, and when I mention I work with EFF they say, “Oh, you guys have been around for a long time.” And it’s true. Like, not just Internet time. Not “this is Zcash’s second birthday so we’re the doddering old men of cryptocurrency” long time. Like, we’ve been around for a quarter century. A legitimate long time. And I want to talk about our origin story, about the key victories that we scored really early on, a quarter century ago, that really are the reason you folks are in this room today.

A grid of logo images for various cryptocurrencies

I want to talk about the Crypto Wars. Not those cryp­to wars. 

ASCII meme image of a man jumping from a building yelling "Crypto stands for cryptography"

These Crypto Wars.

So, back in the late 90s, the NSA classed cryp­tog­ra­phy as a muni­tion and imposed strict lim­its on civil­ian access to strong cryp­to. And there were peo­ple as you heard Primavera speak about who called them­selves cypher­punks, cryp­toa­n­ar­chists, who said that this was bad pol­i­cy; it was a gov­ern­men­tal over­reach and it need­ed to be changed. And they tried a whole bunch of dif­fer­ent tac­tics to try and con­vince the gov­ern­ment that this pol­i­cy was not good policy. 

So, they talked about how it was ineffective, right. They said you can ban civilian access to strong cryptography and only allow access to weak crypto, the 56-bit version of DES, and that will not be sufficient to protect people. They made this as a technical argument. They said, “Look, we believe that you could brute-force DES with consumer equipment.” And the court said, “Well, who are we gonna believe, you or the NSA? Because the NSA, they hire all the PhD mathematicians that graduate from the Big Ten schools, and they tell us that 56-bit DES is good enough for anyone. So why should we believe you?”

A circuit board with a grid of many processors on it

And so, we did this. We built this thing called the DES Cracker. It was a quarter-million-dollar specialized piece of equipment that could crack the entire key space of DES in two hours, right. So we said, “Look. Here’s your technical proof. We can blow through the security that you’re proposing to lock down the entire US financial, political, legal, and personal systems with, for a quarter-million dollars.”
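The brute-force arithmetic behind that claim is easy to reproduce. A minimal sketch, taking the talk’s two-hour full-keyspace figure at face value (that figure is the talk’s number, not the machine’s published spec):

```python
# Back-of-the-envelope numbers for a DES-cracking machine, using the
# talk's "entire key space in two hours" figure (illustrative only).
KEY_BITS = 56                        # DES uses a 56-bit key
keyspace = 2 ** KEY_BITS             # ~7.2e16 candidate keys

sweep_seconds = 2 * 60 * 60          # two hours for the whole keyspace
keys_per_second = keyspace // sweep_seconds
print(f"{keyspace:.3g} keys at ~{keys_per_second:.3g} keys/sec")

# On average a brute-force search succeeds after half the keyspace.
avg_hours = (keyspace / 2) / keys_per_second / 3600
print(f"average time to recover one key: ~{avg_hours:.1f} hours")
```

The point of the exercise is that a fixed key length is a fixed, computable budget for the attacker, which is why “56 bits is enough for anyone” could be refuted with hardware.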

And they said, “Well, maybe that’s true, but we can’t afford to have the criminals go dark, right. They’re gonna hide behind crypto and we won’t be able to spy on them.”

So in the face of all that resistance, we finally came up with a winning argument. We went to court on behalf of a guy named Daniel J. Bernstein. You’ve probably heard of DJB. He’s a cryptographer. He’s a cryptographer whose name is all over all the ciphers you use now. But back then DJB was a grad student at the University of California at Berkeley. And he had written a cipher that was stronger than 56-bit DES. And he was posting it to Usenet. And we went to the Ninth Circuit and we said, “We believe that the First Amendment of the US Constitution, which guarantees the right to free speech, protects DJB’s right to publish source code.” That code is a form of expressive speech as we understand expressive speech in the US Constitutional framework.

And this worked, right. Making technical arguments didn’t work. Making economic arguments didn’t work. Making law enforcement arguments didn’t work. Recourse to the Constitution worked. We won in the Ninth Circuit. We won at the Appellate Division. And the reason that you folks can use ciphers that are stronger than 56-bit DES, which these days you can break with a Raspberry Pi, the reason you can do that is because we won this case. [applause] Thank you.

So I’m not say­ing that to suck up to you, right. I’m say­ing that because it’s an impor­tant note in terms of tac­ti­cal diver­si­ty in try­ing to achieve strate­gic goals. It turns out that mak­ing recourse to the Constitution is a real­ly impor­tant tac­ti­cal arrow to have in your quiver. And it’s not that the Constitution is per­fect. And it’s cer­tain­ly not true that the US always upholds the Constitution, right. All coun­tries fall short of their goals. The goals that the US falls short of are bet­ter than the goals that many oth­er coun­tries fall short of. The US still falls short of those goals and the Constitution is not perfect. 

Headshot of Lawrence Lessig captioned "I'm not trying to get all Lawrence Lessig on you."

And you folks, you might be more com­fort­able think­ing about deploy­ing math and code as your tac­tic, but I want to talk to you about the full suite of tac­tics that we use to effect change in the world. And this is a frame­work that we owe to this guy Lawrence Lessig. Larry is the founder of Creative Commons and has done a lot of oth­er impor­tant stuff with cyber law and now works on cor­rup­tion. That’s a con­nec­tion I’m gonna come back to. And Larry says that there are four forces that reg­u­late our world, four tac­ti­cal avenues we can pursue. 

There’s code: that’s what’s tech­ni­cal­ly pos­si­ble. Making things like Deep Crack. 

There’s mar­kets: what’s prof­itable. Founding busi­ness­es that cre­ate stake­hold­ers for strong secu­ri­ty turned out to be a real­ly impor­tant piece to con­tin­u­ing to advance the cryp­to agen­da because there were peo­ple who would show up and argue for more access to cryp­to not because they believed in the US Constitution but because their share­hold­ers demand­ed that they do that as part of their ongo­ing funding. 

There’s norms: what’s socially acceptable. Moving the discussion of crypto from the realm of math and policy to the realm of what makes people good people in the world; convincing them, for example, that allowing sensitive communications to go in the clear is a risk that you put not just on yourself but on the counterparties to your communication. I mean, I think we will eventually arrive at a place where sending sensitive data in the clear will be the technical equivalent of inviting people to a party where you close the door and chain-smoke, right. It’s your selfish laziness putting them at risk.

And then, there’s law: what’s legal. 

Now, the rule of law is absolutely essential to the creation and maintenance of good cipher systems. Because there is no key length, there’s no cipher system, that puts you beyond the reach of law. You can’t audit every motherboard in every server in the cloud that you rely on for a little backdoor chip the size of a grain of rice that’s tapped right into the motherboard control system.

You can’t make all your friends adopt good operational security. This is a bit of the rules used by the deep packet inspection system deployed by the NSA. This was published in a German newspaper after it was leaked to them. The deep packet inspection rules that the NSA was using to decide who would get long-term retention of their communications and who wouldn’t, they involved looking for people who had ever searched for how to install Tor or Tails or Qubes. So if you had ever figured out how to keep a secret, the NSA then started storing everything you ever sent, in case you ever communicated with someone who wasn’t using crypto and through that conveyed some of the things that were happening inside your black-box conversations. You can’t make everybody you will ever communicate with use good crypto. And so if the state is willing to exercise illegitimate authority, you will eventually be found out by them.

You can’t audit the ciphers that every piece of your toolchain uses, including pieces that you don’t control, that are out of your hands and in the hands of third parties. One of the things we learned from the Snowden leaks was that the NSA had sabotaged the random number generator in a NIST standard in order to weaken it so that they could backdoor it and read it. And so long as the rule of law is not being obeyed, so long as you have spy agencies that are unaccountably running around sabotaging crypto standards that we have every reason to believe are otherwise solid and sound, you can never achieve real security. This turns out to be part of a much larger thing called Bullrun in the US and Edgehill in the UK, which the NSA and GCHQ were jointly doing to sabotage the entire crypto toolchain, from hardware to software to standards to random number generators.

Opsec is not going to save you. Because security favors attackers. If you want to be secure from a state, you have to be perfect. You don’t just have to be perfect when you’re writing code and checking it in. You have to be perfect all the time. You have to never make a single mistake. Not when you’re at a conference you traveled across the ocean to attend and you’re horribly jet-lagged. Not when your baby has woken you up at three in the morning. Not when you’re a little bit drunk. You have to make zero mistakes.

In order for the state to pen­e­trate your oper­a­tional secu­ri­ty, they have to find one mis­take that you’ve made. And they get to cycle a new shift in every eight hours to watch you. They get to have some­one spell off the per­son who’s start­ing to get screen burn-in on their eyes and has to invert the screen because they can no longer focus on the let­ters. They just send some­one else to sit down at that con­sole and watch you. So your oper­a­tional secu­ri­ty is not going to save you. Over time the prob­a­bil­i­ty that you will make a mis­take approach­es one. 
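The “probability approaches one” point is just the complement rule from probability. A minimal sketch, with an assumed, entirely hypothetical per-day slip rate:

```python
# Chance of at least one opsec mistake over n days, assuming independent
# days and a fixed per-day slip probability p (the 0.1% rate is a made-up
# illustration, not a measured figure).
def p_at_least_one_mistake(p_daily: float, days: int) -> float:
    return 1.0 - (1.0 - p_daily) ** days

# Even someone careful enough to slip on only 0.1% of days is more
# likely than not to have made at least one mistake within two years.
for days in (30, 365, 730):
    print(days, round(p_at_least_one_mistake(0.001, days), 3))
```

However small the daily rate, the cumulative probability climbs toward one as the surveillance window grows, which is the defender-versus-watcher asymmetry the talk describes.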

So cryp­to is not a tool that you can use to build a par­al­lel world of code that immu­nizes you from an ille­git­i­mate, pow­er­ful state. Superior tech­nol­o­gy does not make infe­ri­or laws irrelevant. 

But tech­nol­o­gy, and in par­tic­u­lar pri­va­cy and cryp­to­graph­ic tech­nol­o­gy, they’re not use­less. Just because your opsec won’t pro­tect you for­ev­er does­n’t mean that it won’t pro­tect you for just long enough. Crypto and pri­va­cy tools, they can open a space in which, for a lim­it­ed time, before you make that first mis­take, you can be shel­tered from that all-seeing eye. And in that space, you can have dis­cus­sions that you’re not ready to have in pub­lic yet. Not just dis­cus­sions where you reveal that your employ­er has been spy­ing on every­one in the world, but all of the dis­cus­sions that have brought us to where we are today. You know, it’s remark­able to think that with­in our life­times, with­in liv­ing mem­o­ry, it was ille­gal in much of the world to be gay. And now in most of those ter­ri­to­ries gay peo­ple can get mar­ried. It was ille­gal to smoke mar­i­jua­na and now in the coun­try I’m from, Canada, mar­i­jua­na is legal, right, in every province of the coun­try. It was ille­gal to prac­tice so-called inter­ra­cial mar­riage. There are peo­ple who are the prod­ucts of those mar­riages, who were illegal. 

So how in our life­times did we go from these regimes where these activ­i­ties were pro­hib­it­ed, to ones in which they are embraced and con­sid­ered nor­mal? Well it was because peo­ple who had a secret that they weren’t ready to talk about in pub­lic yet could have a space that was semi-pub­lic. Where they could choose their allies. They could find peo­ple who they thought they could trust with a secret. And they could whis­per the true nature of their hearts to them. And they could recruit them into an ever-growing alliance of peo­ple who would stand up for them and their prin­ci­ples. They could whis­per the love that dare not speak its name until they were ready to shout it from the hills. 

And that’s how we got here. If we elim­i­nate pri­va­cy and cryp­tog­ra­phy, if we elim­i­nate the abil­i­ty to have these semi-pub­lic con­ver­sa­tions, we won’t arrive at a place in which social progress con­tin­ues any­way. We’ll arrive at a place that will be much like the hun­dreds of years that pre­ced­ed the legal­iza­tion of these activ­i­ties that are now con­sid­ered nor­mal. Where peo­ple that you love went to their graves with secrets in their hearts that they nev­er con­fessed to you. Great aches that you had unknow­ing­ly con­tributed to, because you nev­er knew their true selves. 

So we need good tech pol­i­cy, and we’re not get­ting it. In fact we’re get­ting bad tech­nol­o­gy pol­i­cy that’s get­ting worse by the day. 

Screenshot of what was displayed to victims of the 2017 hospital ransomware attack, detailing when payment was expected, would go up, and files would possibly be deleted

So, you may remem­ber that over the last two years we dis­cov­ered that hos­pi­tals are com­put­ers that we put sick peo­ple into. And when we take the com­put­ers out of the hos­pi­tals, they cease to be places where you can treat sick peo­ple. And that’s because of an epi­dem­ic of ran­somware. There’s been a lot of focus on the bad IT poli­cies of the hos­pi­tals. And the hos­pi­tals had some bad IT poli­cies. You should­n’t be run­ning Windows XP, there’s no excuse for it and so on. 

But, ransomware had been around for a long time and it hadn’t taken down hospitals all over the world. The way that ransomware ended up taking down hospitals all over the world is somebody took some off-the-shelf ransomware and married it to a thing called Deep Blue. Or EternalBlue, rather. And EternalBlue was an NSA exploit. They had discovered a vulnerability in Windows XP, and rather than taking it to Microsoft and saying, “You guys had better patch this because it’s a really bad zero-day,” they had just kept it secret, in their back pocket, against the day that they had an adversary they wanted to use it against.

Except before that could hap­pen, some­one leaked their cyber weapon. And then dum­dums took the cyber weapon and mar­ried it to this old piece of ran­somware and start­ed to steal hos­pi­tals. Now, why do I call these peo­ple dum­dums? Because the ran­som they were ask­ing for was $300. They did­n’t even know that they’d stolen hos­pi­tals. They’re just oppor­tunis­ti­cal­ly steal­ing any­thing that was con­nect­ed to an XP box and then ask­ing for $300, in cryp­tocur­ren­cy, in order to unlock it. 

So, this is not good technology policy. The NSA believes in a doctrine called NOBUS: “No One But Us is smart enough to discover this exploit.” Now, first of all we know that’s not true. We know that the NSA… From the Crypto Wars, we know that the NSA does not have a monopoly on smart mathematicians, right. These were the people who said 56-bit DES was strong enough for anyone. They were wrong about that; they’re wrong about this. But even if you believe that the NSA would never… that the exploits that they discovered would never be independently rediscovered, it’s pretty obvious that that doesn’t mean that they won’t be leaked. And once they’re leaked, you can never get that toothpaste back in the tube.

Now, since the Enlightenment, for 500 years now, we’ve understood what good knowledge creation and technology policy looks like. So let me give you a little history lesson. Before the Enlightenment, we had a thing that looks a lot like science, through which we did knowledge creation. It was called alchemy. And what alchemists did looks a lot like what scientists do. You observe two phenomena in the universe. You hypothesize a causal relationship: this is making that happen. You design an experiment to test your causal relationship. You write down what you think you’ve learned.

And here’s where sci­ence and alche­my part ways. Because alchemists don’t tell peo­ple what they think they’ve learned. And so they are able to kid them­selves that the rea­son that their results seem a lit­tle off is because maybe they made a lit­tle mis­take when they were writ­ing them down and not because their hypoth­e­sis was wrong. Which is how every alchemist dis­cov­ers for him­self the hard­est way pos­si­ble that you should not drink mer­cury, right. 

So for 500 years, alche­my pro­duces no div­i­dends. And then alchemists do some­thing that is legit­i­mate­ly mirac­u­lous. They con­vert the base met­al of super­sti­tion into the pre­cious met­al of knowl­edge, by pub­lish­ing. By telling oth­er peo­ple what they know. Not just their friends who’ll go easy on them but their ene­mies, right, who if they can’t find a sin­gle mis­take in their work, they know that their work is good. And so, as a first prin­ci­ple when­ev­er you’re doing some­thing impor­tant every­one should be able to crit­i­cize it. Otherwise you nev­er know that it works. So you would hope that that’s how we would oper­ate in the infor­ma­tion secu­ri­ty realm. But that’s not how we’re operating.

In 1998 Congress passed this law, the Digital Millennium Copyright Act. They then went to the European Union in 2001 and arm-twisted them into pass­ing the European Union Copyright Directive. And both of these laws have a rule in them that says that you’re not allowed to break dig­i­tal rights man­age­ment. You’re not allowed to bypass a sys­tem that restricts access to a copy­right­ed work. 

And in the early days, this was primarily used to stop people from making region-free DVD players. But now, everything’s got a copyrighted work in it, because everything’s got a system-on-a-chip in it that costs twenty-two cents and has 50,000 lines of code, including the entire Linux kernel and usually an instance of BusyBox running with the default root password of “admin/admin”.

And because that’s a copyrighted work, anyone who manufactures a device where they could make more money if they could prescribe how you use that device can just add a one-molecule-thick layer of DRM in front of that copyrighted work. And then, because in order to reconfigure the device you have to remove the DRM, they can make removing DRM, and thus using your own property in ways that benefit you, into a felony punishable by a five-year prison sentence and a $500,000 fine.

And so there’s this enormous temptation to add DRM to everything, and we’re seeing it in everything. Pacemakers, voting machines, car engine parts, tractors, implanted defibrillators, hearing aids. There’s a new closed-loop artificial pancreas from Johnson & Johnson… it’s a continuous glucose monitor married to an insulin pump, with some machine learning intelligence to figure out what dose you need from moment to moment. And it uses proprietary insulin cartridges that have a layer of DRM in them to make sure that to stay alive you only feed your internal organ the material that the manufacturer has approved, so that they can charge you an extreme markup.

So that’s bad. That’s the rea­son we’re see­ing DRM every­where. But the effect of that is what it does to secu­ri­ty research. Because under this rule, mere­ly dis­clos­ing defects in secu­ri­ty that might help peo­ple bypass DRM also expos­es you to legal jeop­ardy. So this is where it starts to get scary, because as micro­con­trollers are per­me­at­ing every­thing we use, as hos­pi­tals are turn­ing into com­put­ers we put sick peo­ple into, we are mak­ing it hard­er for crit­ics of those devices to explain the dumb mis­takes that the peo­ple who made them have made. We’re all drink­ing mercury. 

And this is going everywhere. Particularly, it’s going into your browser. So, several years ago the W3C was approached by Netflix and a few of the other big entertainment companies to add DRM to HTML5, because it was no longer technically simple to do DRM in browsers because of the way they were changing the APIs. And the W3C said that they would do it. And there’s a… it’s a long, complicated story why they went into it. But I personally, and EFF, we had a lot of very spirited discussions with the W3C leadership over this. And we warned them that we thought that the companies that wanted to add DRM to their browsers didn’t want to just protect their copyright. We thought that they would use this to stop people from disclosing defects in browsers. Because they wanted to be able to not just control their copyright but ensure that there wasn’t a way to get around this copyright control system.

And they said, “Oh no, never. These companies are good actors. We know them. They pay the membership dues. They would never abuse this process to come after security researchers who were making good-faith, honest, responsible disclosures,” whatever; add your adjective for a disclosure that’s made in a way that doesn’t make you sad, right. There are all these different ways of talking about security disclosures.

And we said alright, let’s find out. Let’s make mem­ber­ship in the W3C and par­tic­i­pa­tion in this DRM com­mit­tee con­tin­gent on promis­ing only to use the DMCA to attack peo­ple who infringe copy­right, and nev­er to attack peo­ple who make secu­ri­ty dis­clo­sures. And the entire cryp­tocur­ren­cy com­mu­ni­ty and blockchain com­mu­ni­ty who were in the W3C work­ing groups, they backed us on this. In fact it was the most con­tro­ver­sial stan­dards vote in W3C his­to­ry. It was the only one that ever went to a vote. It was the only one that was ever appealed. It was the only one that was ever pub­lished with­out unan­i­mous sup­port. It was pub­lished with 58% sup­port, and not one of the major brows­er ven­dors, not one of the big enter­tain­ment com­pa­nies, signed on to a promise not to sue secu­ri­ty researchers who revealed defects in browsers. 

So let’s talk a lit­tle about secu­ri­ty eco­nom­ics and browsers. So secu­ri­ty, obvi­ous­ly it’s not a bina­ry, it’s a con­tin­u­um. We want to be secure from some attack. You heard some­one talk about threat mod­el­ing ear­li­er. So like, you’ve got a bank vault. You know that giv­en enough time and a plas­ma torch, your adver­sary can cut through that bank vault. But you don’t wor­ry about that because your bank vault is not meant to secure your mon­ey for­ev­er, it’s meant to secure your mon­ey until a secu­ri­ty guard walks by on their patrol and calls the police, right. Your bank vault is inte­grat­ed with the rule of law. It is a tech­ni­cal coun­ter­mea­sure that is back­stopped by the rule of law. And with­out the rule of law, your bank vault will even­tu­al­ly be cut open by some­one with a plas­ma cutter. 

So, secu­ri­ty eco­nom­ics means fac­tor­ing in the expect­ed return on a breach into the design of the sys­tem. If you have a sys­tem that’s pro­tect­ing $500 in assets, you want to make sure that it will cost at least $501 to defeat it. And you assume that you have a ratio­nal actor on the oth­er side who’s not going to come out of your breach one dol­lar in the hole. You assume that they’re not going to be dumdums. 

Graph of Ethereum cryptocurrency value over time showing a sharp spike around March 2017

So, there’s a way that this fre­quent­ly goes wrong, a way that you get con­text shifts that change the secu­ri­ty eco­nom­ics cal­cu­lus. And that’s when the val­ue of the thing that you’re pro­tect­ing sud­den­ly goes up a lot and the secu­ri­ty mea­sures that you’re using to pro­tect it don’t. And all of a sud­den your $501 secu­ri­ty isn’t pro­tect­ing $500 worth of stuff. It turns out that it’s pro­tect­ing $5 mil­lion worth of stuff. And the next thing you know there’s some dude with a plas­ma cut­ter hang­ing around your vault. 
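That failure mode is really a one-line inequality: a defense priced against the asset’s old value quietly stops holding when the asset appreciates. A toy sketch, reusing the talk’s illustrative dollar figures:

```python
# A rational attacker only strikes when the expected haul exceeds the
# cost of the attack, so a defense "holds" while attack_cost > asset_value.
def defense_holds(attack_cost: float, asset_value: float) -> bool:
    return attack_cost > asset_value

attack_cost = 501.0  # what the vault was engineered to withstand
print(defense_holds(attack_cost, 500.0))        # the designed-for case
print(defense_holds(attack_cost, 5_000_000.0))  # after the context shift
```

The code is trivial on purpose: nothing about the defense changed between the two calls, only the value behind it, and that alone flips the attacker’s calculus.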

So this chal­lenge is espe­cial­ly keen in the realm of infor­ma­tion secu­ri­ty because infor­ma­tion secu­ri­ty is tied to com­put­ers, and com­put­ers are every­where. And because com­put­ers are becom­ing inte­grat­ed into every facet of our life faster than we can even keep track of it, every day there’s a new val­ue that can be real­ized by an attack­er who finds a defect in com­put­ers that can be wide­ly exploit­ed. And so every day the cost that you should be spend­ing to secure your com­put­ers is going up. And we’re not keep­ing up. In fact, com­put­ers on aver­age are becom­ing less secure because the val­ue that you get when you attack com­put­ers is becom­ing high­er, and so the expect­ed adver­sary behav­ior is get­ting bet­ter resourced and more dedicated. 

So this is where cryp­tocur­ren­cy does in fact start to come into the sto­ry. It used to be that if you found a defect in widely-used con­sumer com­put­ing hard­ware, you could expect to real­ize a few hun­dred or at best a few thou­sand dol­lars. But in a world where intrin­si­cal­ly hard-to-secure com­put­ers are being asked to pro­tect exponentially-growing cryp­tocur­ren­cy pools…well you know how that works, right? You’ve seen cryp­to­jack­ing attacks. You’ve seen all the exchanges go down. You under­stand what hap­pens when the val­ue of the asset being pro­tect­ed shoots up very sud­den­ly. It becomes extreme­ly hard to protect. 

So, you would expect that in that world, where every­thing we do is being pro­tect­ed by com­put­ers that are intrin­si­cal­ly hard to pro­tect and where we need to keep adding more resource to pro­tect them, that states would take as their watch­word mak­ing cryp­to as easy to imple­ment as pos­si­ble; mak­ing secu­ri­ty as easy as pos­si­ble to achieve. But, the reverse is hap­pen­ing. Instead what’s hap­pen­ing is states are start­ing to insist that we’re gonna have to sac­ri­fice some of our secu­ri­ty to achieve oth­er pol­i­cy goals.

So this guy used to be Prime Minister of Australia; he’s not anymore. Wait six months, and the current Prime Minister of Australia will also not be Prime Minister of Australia anymore. This guy, Malcolm Turnbull… Sorry, did I just get his name wrong? I just blew up his name. What is his name? God, he went so quickly. Malcolm Turnbull, it is Malcolm Turnbull, it’s right there on the slide. I almost called him Malcolm Gladwell.

So he gave this speech where he was explaining why he was going to make it the law that everybody had to backdoor their crypto for him. And you know, all these cryptographers had shown up and they said, “Well, the laws of math say that we can’t do that. We can’t make you a thing that’s secure enough to protect the government and its secrets, but insecure enough that the government can break into it.”

And he said… I’m not gonna do the accent. He said, “The laws of Australia prevail in Australia. I can assure you of that. The laws of mathematics are very commendable, but the only law that applies in Australia is,” read it with me, “the law of Australia.” I mean… This may be the stupidest technology thing ever said in the history of really dumb technology utterances.

But he almost got there. And he’s not alone, right. The FBI has joined him in this call. You know, Canada’s joined him in this call. Like, if you ever needed proof that merely having good pecs and good hair doesn’t qualify you to have good technology policy, the government of Justin Trudeau and its technology policy has demonstrated this forever. This is an equal-opportunity madness that every developed state in the world is at least dabbling in.

And we have end­ed up not just in a world where fight­ing crime means elim­i­nat­ing good secu­ri­ty. I mean it’s dumb­er than that, right. We’ve end­ed up in a world where mak­ing sure peo­ple watch TV the right way means sac­ri­fic­ing on security. 

Now, the European Union, they just actually had a chance to fix this. Because that copyright directive that the US forced them to pass in 2001, the one that has the stupid rule in it that they borrowed from the DMCA, just came up for its first major revision in seventeen years. The new copyright directive is currently nearly finalized; it’s in its very last stage. And rather than fixing this glaring problem with security in the 21st century, what they did was they added this thing called Article 13.

So Article 13 is a rule that says if you operate a platform where people can convey a copyrighted work to the public… So like if you have a code repository, or if you have Twitter, or if you have YouTube, or if you have SoundCloud, or if you have any other way that people can make a copyrighted work available. If you host Minecraft skins, you are required to operate a crowdsourced database of all the copyrighted works that people care to add to it and claim; so anyone can upload anything to it and say, “This copyright belongs to me.” And if a user tries to post something that appears in the database, you are obliged by law to censor it. And there are no penalties for adding things to the database that don’t belong to you. You don’t even have to affirmatively identify yourself. And the companies are not allowed to strike you off from that database of allegedly copyrighted works, even if they repeatedly catch you chaffing the database with garbage that doesn’t belong to you: the works of William Shakespeare, all of Wikipedia, the source code for some key piece of blockchain infrastructure, which now can’t be posted to a WordPress blog and discussed until someone at Automattic takes their tweezers and goes to the database and pulls out these garbage entries, whereupon a bot can reinsert them into the database one nanosecond later.
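To see why a database like that is so easy to abuse, here is a minimal model of the mechanism as the talk describes it. Every name here is hypothetical; this is not any real filter’s API:

```python
# Minimal model of a crowdsourced copyright-claims database: anyone may
# claim any work, and the platform must block exact matches.
claims: set = set()

def claim(work: str) -> None:
    # No identity requirement, no penalty for a false claim.
    claims.add(work)

def can_post(upload: str) -> bool:
    # The platform is obliged to censor anything matching a claim.
    return upload not in claims

claim("To be, or not to be")              # a troll claims Shakespeare
print(can_post("To be, or not to be"))    # the quotation is now blocked
print(can_post("an original sentence"))   # unclaimed text still posts
```

One bogus, anonymous entry censors every legitimate use of a work, and nothing in the mechanism pushes back against filling the database with chaff.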

So this is what they did, instead of fixing anti-circumvention rules to make the Internet safe for security. So, I mentioned this is in its very last phase of discussion, and it looked like it was going to be fixed, and then the Italian government changed over and they flipped positions. And we’re actually maybe going to get to kill this, but only if you help. If you’re a European, please go to saveyourinternet.eu and send a letter to your MEPs. This is really important. Because this won’t be fixed for another seventeen years if this passes; saveyourinternet.eu.

So, when we ask our­selves why are gov­ern­ments so inca­pable of mak­ing good tech­nol­o­gy pol­i­cy, the stan­dard account says it’s just too com­pli­cat­ed for them to under­stand, right. How could we expect these old, decrepit, irrel­e­vant white dudes to ever fig­ure out how the Internet works, right? If it’s too tech­no­log­i­cal you’re too old, right? 

But sort­ing out com­pli­cat­ed tech­ni­cal ques­tions, that’s what gov­ern­ments do. I mean, I work on the Internet and so I think it’s more com­pli­cat­ed than oth­er peo­ple’s stuff. But you know, when I’m being real­ly rig­or­ous­ly hon­est? I have to admit that it’s not more com­pli­cat­ed than pub­lic health, or san­i­ta­tion, or build­ing roads. And you know, we don’t build roads in a way that is as stu­pid as we have built the Internet.

And that’s because the Internet is much more hot­ly con­test­ed. Because every realm of endeav­or inter­sects with the Internet, and so there are lots of pow­er­ful inter­ests engaged in try­ing to tilt Internet pol­i­cy to their advan­tage. The TV exec­u­tives and media exec­u­tives who pushed for Article 13 you know, they’re not doing it because they’re mustache-twirling vil­lains. They’re just doing it because they want to line their pock­ets and they don’t care what costs that impos­es on the rest of us. Bad tech pol­i­cy, it’s not bad because mak­ing good pol­i­cy is hard. It’s bad because mak­ing bad pol­i­cy has a busi­ness mod­el.

Now, tech did not cause the corruption that distorts our policy outcomes. But it is being supercharged by the same phenomenon that is distorting our policy outcomes. And that’s what happened with Ronald Reagan, and Margaret Thatcher, and their cohort who came to power the same year the Apple II Plus shipped. And among the first things they did in office was dismantle our antitrust protections and allow companies to do all kinds of things that would have been radioactively illegal in the decades previous. Like buying all their competitors. Like engaging in illegal tying. Like using long-term contracts in their supply chain to force their competitors out. Like doing any one of a host of things that might have landed them in front of an antitrust regulator and gotten them broken up into smaller pieces the way AT&T had been.

And as that happened, we ended up in a period in which inequality mounted and mounted and mounted. And forty years later, we’ve never lived in a more unequal world. We have surpassed the state of inequality of 18th century France, which for many years was the gold standard for just how unequal a society can get before people start chopping off other people’s heads.

And unequal states are not well-regulated ones. Unequal states are states in which the peccadillos, cherished illusions, and personal priorities of a small number of rich people who are no smarter than us start to take on outsized policy dimensions. Where the preferences and whims of a few plutocrats become law. [applause]

In a plutocracy, policy only gets to be evidence-based when it doesn’t piss off a rich person. And we cannot afford distorted technology policy. We are at a breaking point. Our security and our privacy and our centralization debt is approaching rupture. We are about to default on all of those debts, and we won’t like what the bankruptcy looks like when that arrives.

Which brings me back to cryptocurrency and the bubble that’s going on around us. The bubbles, they’re not fueled by people who have an ethical interest in decentralization or who worry about overreaching state power. Those bubbles, right, all the frothy money that’s in there. Not the coders who are writing it or the principled people who think about it, but all the money that’s just sloshing through it and making your tokens so volatile that the security economics are impossible. That money is being driven by looters, who are firmly entrenched in authoritarian states. The same authoritarian states that people interested in decentralization say they want to get rid of. They’re the ones who are buying cyber weapons to help them spy on their own populations to figure out who is fomenting revolutions, so they can round them up and torture them and arrest them. So that they can be left to loot their national treasuries in peace and spin the money out through financial secrecy havens like the ones that we learned about in the Panama Papers and the Paradise Papers.

And abetting the oligarchic accumulation of wealth, that is not gonna create the kinds of states that produce the sound policy that we need to make our browsers secure. It will produce states whose policy is a funhouse mirror reflection of the worst ideas of the sociopaths who have looted their national wealth and installed themselves as modern feudal lords.

Your cryptography will not save you from those states. They will have the power of coercive force and the unblinking eye of 24/7 surveillance contractors. The Internet, the universal network where universal computing endpoints can send and receive cryptographically secure messages, is not a tool that will save us from coercive states, but it is a tool that will give us a temporary shelter within them. A space that even the most totalitarian of regimes will not be able to immediately penetrate. Where reformers and revolutionaries can organize, mobilize, and fight back. Where we can demand free, fair, and open societies with broadly-shared prosperity across enough hands that we can arrive at consensuses that reflect best evidence and not the whims of a few. Where power is decentralized.

And incidentally, having good responsive states will not just produce good policy when it comes to crypto. All of our policy failures can be attributed to a small, moneyed group of people who wield outsize power to make their bottom line more important than our shared prosperity. Whether that’s the people who spent years expensively sowing doubt about whether or not cigarettes would give us cancer, or the people who today are assuring us that the existential threat that the human species is facing is a conspiracy among climate scientists who are only in it for the money.

So you’re here because you write code. And you may not be inter­est­ed in pol­i­tics, but pol­i­tics is inter­est­ed in you. The rule of law needs to be your alpha and omega. Because after all, all the Constitution is a form of con­sen­sus, right. It’s the orig­i­nal consensus-seeking mech­a­nism. Using the rule of law to defend your tech­nol­o­gy, it’s the most Internet thing in the world. Let’s go back to Bernstein. When we went to Bernstein and argued this case, we essen­tial­ly went on an Internet mes­sage board and made bet­ter argu­ments than the oth­er peo­ple. And we con­vinced the peo­ple who were lis­ten­ing that our argu­ments were right. This is how you folks resolve all of your prob­lems, right? Proof of con­cept. Running code. Good argu­ments. And you win the bat­tle of the day. 

So making change with words? That’s what everybody does, whether we’re writing code or writing law. And I’m not saying you guys need to stop writing code. But you really need to apply yourself to the legal dimension, too. Thank you.


Cory Doctorow: So, we’re gonna ask some questions now. I like to call alternately on people who identify as women or non-binary and people who identify as male or non-binary, and we can wait a moment if there’s a woman or non-binary person who wants to come forward first. There’s a mic down there and then there’s a rover with a mic. Just stick up your hand.

Audience 1: As someone who’s spent a lot of time involved in the Internet, I’m sure you’ve read the book The Sovereign Individual. And I recently read this book and it talked a lot about how the Internet would increase the sovereignty of individuals and also how cryptocurrencies would. And it predicted a massive increase in inequality as a direct result of the Internet. Could you comment on that?

Doctorow: Yeah, I haven’t read the book so I’m not gonna comment directly on the book. But I think it’s true that if you view yourself as separate from the destinies of the people around you, it will produce inequality. I think that that’s, like, empirically wrong, right. Like, if there’s one thing we’ve learned about the limits of individual sovereignty it’s that, you know, you have a shared microbial destiny. You know, I speak as a person who left London in the midst of a measles epidemic and landed in California right after they stamped it out by telling people that you had to vaccinate your kids or they couldn’t come to school anymore.

We do have shared destinies. We don’t have individual sovereignty. And even if you’re the greatest and— You know, anyone who’s ever run a business knows this, right. You could have a coder who’s a 100X coder, who produces 100 times more lines of code than everybody else in the business. But if that coder can’t maintain the product on their own, and if they’re a colossal asshole that no one else can work with? Then that coder is a liability, not an asset, right. Because you need to be able to work with more than one person in order to attain superhuman objectives. Which is to say, things more than one person can do. And everything interesting is superhuman, right. The limits on what an individual can do are pretty strong.

And so yeah, I think that that’s true. I think that the kind of policy bent toward selfishness kind of self-evidently produces more selfish outcomes. But not better ones, right. Not ones that reflect kind of a shared prosperity and growth. Thank you.

Hi.

Audience 2: Hi. I have had the pleasure of seeing you keynote both Decentralized Web Summits, and the ideas you bring to these talks always really stay with me longer than anything else.

Doctorow: Thank you.

Audience 2: With what you’ve talked about here, this is honestly one of the most intimidating and terrifying topics, and I’m wondering, besides staying informed and trying not to get burned out by it all, what are some ways that people can make a difference?

Doctorow: So, I recently moved back from London to California, as I mentioned. And one of the things that that means is I have to drive now, and I’m a really shitty driver. And in particular I’m a really shitty parker. So when I have to park, [miming wild steering motions:] I do a lot of this, and then a lot of this, and then a lot of this, and a lot of this. And what I’m doing is I’m like…moving as far as I can to gain one inch, or centimeter, of available space. And then moving into that centimeter of available space, because that opens up a new space that I can move into. And then I move as far as I can and I open up a new space.

We do this in computing all the time, right. We call it hill-climbing. We don’t know how to get from A to Zed. But we do know how to get from A to B, right. We know where the higher point of whatever it is we’re seeking is—stability or density or interestingness or whatever. And so we move one step towards our objective. And from there we get a new vantage point. And it exposes new avenues of freedom that we can take. I don’t know how we get from A to Zed. I don’t know how we get to a better world. And I actually believe that because the first casualty of every battle is the plan of attack, that by the time we’ve figured out the terrain, it would have been obliterated by the adversaries who don’t want us to go there.
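[The hill-climbing heuristic described above can be sketched in a few lines of Python. This is an illustrative aside, not part of the talk; the scoring function, step size, and iteration limit are all assumptions chosen for the example:]

```python
def hill_climb(score, start, step=1.0, max_iters=1000):
    """Greedy hill-climbing: from the current position, look at the
    immediate neighbors and move to whichever one improves the score.
    Stop when no neighbor is better (a local optimum)."""
    x = start
    for _ in range(max_iters):
        # We can't see the whole terrain, only one step in each direction.
        neighbors = [x - step, x + step]
        best = max(neighbors, key=score)
        if score(best) <= score(x):
            break  # no neighbor improves things: we're at a local peak
        x = best  # take the step; a new vantage point opens up
    return x

# Example: climb toward the peak of an upside-down parabola at x = 3.
peak = hill_climb(lambda x: -(x - 3) ** 2, start=0.0, step=0.5)
```

The point of the metaphor holds in the code: the algorithm never plans a route from A to Zed; it only ever asks which adjacent move is better right now, and it can get stuck on a local hill that isn’t the highest one.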

And so, instead, I think we need heuristics. And that heuristic is to see where your freedom of motion is at any moment and take it. Now, Larry Lessig, he’s got this framework, the four forces: code, law, norms, and markets. My guess is that most of the people in this room are doing a lot with code and markets, right. That’s kind of where this conference sits in that little two-by-two. And as a result you may be blind to some of the law and norms issues that are available to you. It might be jumping on EFF’s mailing list, or, if you’re a European, getting on the EDRi mailing list. Or the mailing list for the individual digital rights groups in your own countries, like Netzpolitik in Germany, or La Quadrature du Net in France, or Open Rights Group in the UK, or Bits of Freedom in the Netherlands, and so on.

Getting on those lists and, at the right moment, calling your MEP, calling your MP, or even better yet, like, actually going down when they’re holding surgeries, when they’re holding constituency meetings. They don’t hear from a lot of people who are technologically clued-in. Like, they only get the other side of this. And you know, I’ve been in a lot of these policy forums, and oftentimes the way that the other side prevails is just by making it up, right. Like one of the things we saw in this filter debate: we had computer scientists who were telling MEPs… You know, the seventy most eminent computer scientists in the world, right, a bunch of Turing Award winners, Vint Cerf and Tim Berners-Lee, said like, “These filters don’t exist and we don’t know how to make ’em.” And they were like, “Oh, we’ve got these other experts who say ‘we know how to do it.’ ” And they had been told for years that the only reason nerds hadn’t built those filters is they weren’t nerding hard enough, right.

And if they actually hear from their own constituents, people who run small businesses that are part of this big frothy industry that everybody wants their national economies to participate in. Who show up at their lawmakers’ offices and say, “This really is catastrophic. It’s catastrophic to my business. It’s catastrophic to the Internet,” they listen to that. It moves the needle.

And you know, you heard earlier someone ask, are we at the pitch now? Well, I should pitch, right? I work for the Electronic Frontier Foundation. We’re a nonprofit. The majority of our money comes from individual donors. It’s why we can pursue issues that are not necessarily on the radar of the big foundations or big corporate donors. We’re not beholden to anyone. And it’s people like you, right, who keep us in business. And I don’t draw money from EFF. I’m an MIT Media Lab research affiliate, and they give EFF a grant that pays for my work. So the money you give to EFF doesn’t land in my pocket. But I’ve been involved with them now for fifteen years and I’ve never seen an organization squeeze a dollar more. So I really think it’s worth your while; eff.org. Thank you.

Oh. Someone over here. Yes, hi.

Audience 3: Thank you very much. Really appreciate the speech. It was very inspiring.

Doctorow: Thank you.

Audience 3: Um, I think…maybe not sure how many other people feel this way, but one thing that’s been hard for me about politics in general, especially in the age of social media, is, you know…there’s a lot of it that spreads messages of fear and anger and hatred. And sometimes it feels like when you want to say something and you want to spread a certain voice or just spread a certain message, that there’s this fear of getting swept up in all these messages and ideas and things that aren’t necessarily… You’re not necessarily aware of your own biases and things like that. How does one stay sane and fight, you know, the right fight?

Doctorow: God. You know, I wish I knew. I’ll freely admit to you I’ve had more sleepless nights in the last two years than in all the years before it. I mean, even during the movement to end nuclear proliferation that I was a big part of in the 80s, when I thought we were all going to die in a mushroom cloud, I wasn’t as worried as I am now. It’s tough.

I mean, for me, like, just in terms of, like, personal…psychological opsec? I’ve turned off everything that non-consensually shoves Donald Trump headlines into my eyeballs. You know, we talk a lot about how, like, engagement metrics distort the way applications are designed. But you know, I really came to understand that that was happening about a year and a half ago. So for example they changed the default Android search bar, so that when you tapped in it, it showed you trending…searches. Well, like, nobody has ever gone to a search engine to find out what other people are searching for, right? And the trending searches were inevitably “Trump threatens nuclear armageddon.” So the last thing I would do before walking my daughter to school every morning is I would go to the weather app. And I would tap in it to see the weather. And it’s weather and headlines. And the only headlines you can’t turn off are top headlines, and they’re trend—you know, they’re all “Trump Threatens Nuclear Armageddon,” right?

So I realized after a month of this that what had been really the most calming, grounding fifteen minutes of my day, where I would walk with my daughter to school, and we’d talk about stuff and it was really quiet—we live on a leafy street… I’d just spend that whole time worrying about dying, right?

And so, I had to figure out how to, like, go through and turn all that stuff off. Now what I do is I block out times to think about headlines. So I go and I look at the news for a couple hours every day…and I write about it. I write Boing Boing, right. I write a blog about it. Not necessarily because my opinions are such great opinions. But because being synthetic and thoughtful about it means that it’s not just…buffeting me, right? It becomes a reflective rather than a reflexive exercise.

But I don’t know, right? I mean, I think that— And I don’t think it’s just the tech. I think we are living in a moment of great psychic trauma. We are living in a— You know. The reason the IPCC report was terrifying was not because of the shrill headlines. The IPCC report was terrifying because it is objectively terrifying, right.

And so, how do you make things that’re… I don’t know how you make things that’re objectively terrifying not terrifying. I think the best we can hope for is to operate, while we are terrified, with as much calm and aplomb and thoughtfulness as is possible.

How are we for time, do you want me off? I know my clock’s run out. Or can I take one more question? Stage manager? One more or… One more. Alright. And then they’ll ring us off.

Audience 4: Yeahhh! Hi.

Doctorow: Better be good, though.

Audience 4: Okay, I’m ready. I work for the Media Lab, too. So, my question Cory—thank you for your talk. I think a lot of people in the cryptocurrency world think about the current systems that we exist in. And we’re trying to exit those systems to some extent and create parallel…financial, you know, political institutions, what have you, versus expressing voice within the current system. How do you balance exit versus voice in the current system?

Doctorow: Well… You know, in a technol— And I said before that, like, a Constitutional argument is just an Internet flame war by another means, right? So, when you’re arguing about a commit and a pull request, one of the things you do is you do a proof of concept, right? You show that the thing that you’re patching is real and can be exploited. Or you show that you’ve got unit tests to show that your patch performs well.

Those parallel exercises are useful as proofs of concept and as unit tests, right? They’re prototypes that we can fold back into a wider world. And I think that… The thing I worry about is not that technologists will build technology. I want technologists to build technology. It’s that they will think that the job stops when you’ve built the proof of concept. That’s where the job starts, right? When you can prove that you’ve written a better algorithm, you then have to convince the other stakeholders in the project that it’s worth the actual, like, non-zero cost of patching to make that work, right? Of going through the whole source tree and finding all the dependencies on the things that you’re pulling out and gracefully replacing them. Because you know, when you run a big data center you can’t just start patching stuff…you’ve got a toolchain that you have to preserve, right?

And so that’s where the job starts, right? Build your proof of concept, build us a parallel financial system, build us a whatever…so that we can figure out how to integrate it into a much wider, more pluralistic world. Not so that we can separate and seastead on our little…you know, world over there. Like, it doesn’t matter how great your— [applause] Thank you. Doesn’t matter how great your bunker is, right? Like, you can’t shoot germs, right? Like, if your solution allows the rest of the world to fall into chaos, and no one’s taking care of the sanitation system, you will still shit yourself to death of cholera, in your bunker, because, like, you can’t shoot germs, right? So we need pluralistic solutions that work for all of us.

Audience 4: Thank you.

Doctorow: Thank you.

Alright. Thanks everyone.

Further Reference

Devcon IV homepage