So, academia’s characterized by argument, by cut and thrust, by disagreement. But there’s a very special kind of disagreement that’s hard to make progress from, and that’s denialism. This is manufactured controversy from people who benefit from making it seem like there’s controversy about something for which there is no actual controversy among practitioners.

The canonical example of this, of course, is smoking. The cancer denial movement, through which high-paid consultants to the tobacco industry spent decades casting doubt over whether or not there was a causal link between lung cancer and tobacco, was fantastically profitable and became the template for all of the denial movements that followed.

The next movement that really caught fire was of course the AIDS denial movement, which had incredibly grave consequences, consequences that probably rival cancer denial for the kinds of fallout that we experience from it. One of the loci of this is a guy named Matthias Rath, a German doctor who ran a very profitable vitamin business in the EU. And he claimed that AIDS wasn’t caused by HIV, but rather by a vitamin deficiency that could be treated with his products. So he ran these full-page ads that said “Why should South Africans be poisoned with AZT?” all through South Africa, to encourage people not to take antiretrovirals and instead to take vitamins and nostrums.

And as if this wasn’t bad enough, he had the ear of the president, Thabo Mbeki. And through him and through the health minister, they effectively used vitamins instead of antiretrovirals for several years to treat HIV, which led to as many as three hundred thousand people dying. And during this period, the proportion of South Africans who were HIV-positive rose from about 1% to about 25%.

And of course one of the elements of denial is that when people call you out on denial, you have to be able to silence them. This is where the forbidden research came in. When Ben Goldacre, an epidemiologist and scientist, published on this in the Guardian newspaper, Rath sued the Guardian, and they spent about three hundred thousand pounds winning the right to publish their story about the fallout from AIDS denial.

AIDS denial begat climate denial, which is alive and well today, the non-controversy about anthropogenic climate change. And then the kind of denial that’s maybe particularly relevant to an MIT audience, and the kind of denial I’m going to talk about mostly today, is Turing-completeness denial.

So, we only really know how to make one kind of computer. That’s the computer that can run all the programs that we can express symbolically. But for lots of reasons, people would like it to be possible to make computers that can only run programs that don’t make you sad. That would be great. It would be awesome if we could make printers that couldn’t also be vectors for malware. But the reason we haven’t done that isn’t because the nerds are refusing to cooperate with the forces of right and justice. It’s because this is the computer we know how to make.
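To see why, here’s a minimal sketch of the underlying argument, under the usual computability assumptions. The names (is_bad, halts, do_something_bad) are hypothetical, invented purely for illustration: if a general “does this program do something that makes you sad?” checker existed, it could be turned into a halting-problem decider, which Turing proved impossible.

```python
def is_bad(program_source: str) -> bool:
    """Hypothetical oracle: does this program ever do something bad?
    No total, always-correct version of this can exist for arbitrary
    programs (Rice's theorem); stubbed here only to show the reduction."""
    raise NotImplementedError("undecidable for arbitrary programs")

def halts(program_source: str) -> bool:
    """If is_bad existed, halting would be decidable: wrap the program
    so the 'bad' act happens exactly when the program finishes."""
    wrapped = program_source + "\ndo_something_bad()  # reached only if the code above halts"
    return is_bad(wrapped)
```

So a computer that can run everything we can express symbolically can’t, in general, pre-screen for just the unwanted programs; any restriction has to be bolted on afterward.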

One of the canonical examples of Turing-completeness denial is digital rights management, this idea that if you want to stop people from running programs that make copies of files that you wouldn’t like them to copy on their computer, you can encrypt the file and send it to them, and also send them the key, but ask their computer not to let them know what the key is. The technical term for this in security circles is “wishful thinking.” We don’t keep even really good safes in bank robbers’ living rooms. Not because safes don’t work, but because you can’t give your adversary the key and then hope that they can’t figure out where you hid it. Especially when your adversary might be a grad student with nothing to do this weekend and a bunch of undergrads hanging around like a bad smell, in a lab with an electron-tunneling microscope that’s going idle.
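Here’s a minimal sketch of why this is wishful thinking, using the Python cryptography package’s Fernet recipe; this illustrates the structure of the problem, not any vendor’s actual scheme. The “approved” player and the forbidden copier are the same decrypt call, because the user’s machine necessarily holds both the ciphertext and the key:

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()                  # the "hidden" key, shipped to the user anyway
locked = Fernet(key).encrypt(b"the movie")   # the DRM-wrapped file

def approved_player(token: bytes, k: bytes) -> None:
    """What the vendor intends: decrypt, render, discard."""
    print("playing:", Fernet(k).decrypt(token))

def save_a_copy(token: bytes, k: bytes) -> bytes:
    """What the vendor hopes is impossible: same inputs, same
    decrypt call, and the plaintext is yours to keep."""
    return Fernet(k).decrypt(token)

approved_player(locked, key)
print("copied:", save_a_copy(locked, key))
```

The only thing separating the two functions is the hope that the key stays hidden on hardware the adversary physically controls.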

But of course Turing denial isn’t just about DRM. One of the most virulent manifestations of it in the last two years has been cryptography denial, something that we had put behind us in the 1990s during the Clipper chip debate, but which has resurfaced. This is the idea that we can make cryptography that works perfectly well except when it needs to catastrophically fail, at which point it will catastrophically fail but continue to work for all the people it shouldn’t catastrophically fail for.
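Here’s a toy model of the idea, again sketched with the Fernet recipe; the escrow design is invented for illustration and stands in for no real proposal. Every message is also wrapped for a master escrow key, so “lawful access” works exactly as advertised, right up until that single master secret leaks:

```python
from cryptography.fernet import Fernet  # pip install cryptography

ESCROW_KEY = Fernet.generate_key()  # the single point of catastrophic failure

def send(plaintext: bytes, recipient_key: bytes) -> tuple[bytes, bytes]:
    """Encrypt once for the recipient, once for the escrow agent."""
    return (Fernet(recipient_key).encrypt(plaintext),
            Fernet(ESCROW_KEY).encrypt(plaintext))

alice_key = Fernet.generate_key()
for_alice, for_escrow = send(b"meet at noon", alice_key)

# Intended path: only Alice reads her copy.
assert Fernet(alice_key).decrypt(for_alice) == b"meet at noon"

# Failure mode: whoever obtains ESCROW_KEY (an insider, a rival state,
# a bored grad student) reads everyone's traffic, retroactively and forever.
stolen_key = ESCROW_KEY
print(Fernet(stolen_key).decrypt(for_escrow))
```

There is no key that can only be found by the good guys; the backdoor works for whoever holds it.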

And of course the other kind of Turing-completeness denial that we have is privacy denial, the idea that if you have nothing to hide you have nothing to fear. That secrecy is the same as privacy. That because I know what you do when you go into the toilet, it shouldn’t be your right to close the door. That kind of denial has really caught fire in the last fifteen or twenty years, as surveillance capitalism has risen to its current preeminence.

And the thing about denial is it begets nihilism. Denial matters because the things that are being denied (the potential harms of privacy invasion, anthropogenic climate change, AIDS, cancer) are real. And the non-solutions that arise when you deny them don’t solve these problems, which are real and getting worse, because they’re not being addressed through our policy, because we can’t address them, because we’re in denial about them.

So for many years, people who were worried about the risk of cancer smoked light cigarettes, as though there were a kind of cigarette that didn’t give you cancer. It’s not true. In case you were wondering, there isn’t a kind of cigarette that doesn’t give you cancer. There were people who went on having unprotected sex without taking antiretrovirals, because they’d been told that AIDS was not a sexually transmissible illness but rather a vitamin deficiency. And if it’s a vitamin deficiency, then it doesn’t matter if you continue having unprotected sex. Recall that South Africa’s rate of infection went from 1% to 25% during its period of official denial. Or, you know, we insist that there are ways that we can build on floodplains or continue to emit a lot of carbon, and that this somehow won’t cause lots of catastrophic problems down the way.

In the realm of DRM, we insist that the reason artists aren’t getting their share of the income being generated by their works isn’t that they have bad relationships with the firms that monetize their work; it’s that their audience has failed to watch their TV shows in the right way, or listen to their music in the right way. And we insist that somewhere out there is a tool that will force people to listen to music in the right way, and that that will somehow put money in the pockets of artists, as opposed to, for example, organizing to insist that they get better contractual arrangements with their publishers, labels, and studios.

Or in the realm of crypto denial, we build out infrastructure that has known holes in it, holes that we have put in it so that it can be accessed lawfully, so that the backdoors can be used by law enforcement. And then this stuff, once it’s out in the field, can’t ever be remediated, because it’s sitting in these remote locations where we have a hard time patching it. You may have seen that yesterday CERT published an advisory on vulnerabilities in baseband radios in mobile devices. There are literally billions of these devices. They’re in the field. We will never, ever patch all of them. If you can develop an attack against those baseband radios, you can bypass the operating system of the mobile device to access its data and implant malware on it. Even hardened devices, devices that use strong cryptography, devices that are keyed to anticipate attack: they are unprotected on the baseband side.

And we encourage the formation of businesses based on siphoning off ever-larger caches of our sensitive information, on more and more insane and improbable bets that someday we’ll figure out how to turn all of that into giant amounts of money. It’s the old-lady-who-swallowed-the-fly problem, right? Once you accept that we need to solve this problem by smoking lighter cigarettes, by taking more vitamins, it begets another problem. You must not be taking the right vitamins. You must not be smoking light enough cigarettes. You must not be trying hard enough to lock down hardware so that users can’t reconfigure it.

So, the problem is still there. The solution hasn’t worked. And the denial movement won’t admit it, because to admit it would be to admit that they were wrong. Instead we pass a law that says disclosing vulnerabilities in DRM is a felony punishable by five years in prison and a five-hundred-thousand-dollar fine. Because although we know that the DRM can be broken, we assume that we can just silence the people who discover those flaws. And this makes things a lot worse, right? If you’re not allowed to tell people about flaws in systems that they rely on, it doesn’t mean that those flaws won’t get weaponized and used against them. It just means that they’ll never know about it until it’s too late. Everyone should have the absolute right to know whether or not the technology they rely on is working.

And we also create these gigantic terms of service that say that privacy isn’t a problem because by standing in the vicinity of a thing that’s siphoning off your personal information, you’ve agreed that it’s allowed to take all your information and also wear your underwear and make long distance calls and eat all the food in your fridge and punch your grandmother. And therefore there is no privacy problem, because you’ve agreed that there is no problem.

So we spend more money, we take more measures, we waste more of everyone’s time, and then it starts to feel like it’s too much trouble to even bother with. It’s a fact of life. Sure, cigarettes are gonna kill me someday, but what the hell, it’s too late now. There’s already so much carbon in the atmosphere; why should we stop driving? The entertainment industry is going to insist on digital rights management no matter what we do; why shouldn’t we just accommodate them and put it in all of our technology? I’m gonna leak my data no matter what, so I might as well join Facebook and get invited to some parties on the way to the information apocalypse.

And so it creates this idea that there’s no future. That you might as well just give up. But there is an alternative. Because at a certain point, no matter how much denial and FUD there is, the problem becomes undeniable, right? Even though we can’t agree on the cause, we can agree that there is a problem. So with privacy, for example, the US government says that the Computer Fraud and Abuse Act makes violating terms of service a felony. And as Ethan just described, this means that we can’t investigate in depth how services gather information and use it, because in order to do so we have to violate their terms of service. And since the terms of service have the power of law, we risk going to jail just to find out what’s going on.

Or the Digital Millennium Copyright Act, the 1998 statute that has lots and lots of clauses, but whose Section 1201, according to the US government, makes it a crime to investigate systems that have digital rights management in them and to divulge their flaws, to divulge their workings.

Digital rights management thus becomes a kind of attractive nuisance. Because once you add a skin of digital rights management around your technology, you can sue anyone who breaks it, even for lawful purposes. Which means that it ends up metastasizing into all kinds of things. We have light bulbs with digital rights management. And Philips, who make the Hue product, last year briefly introduced a firmware update that caused their light sockets to reject non-Philips light bulbs. And since you had to bypass their DRM in order to remediate this, it briefly became a potential felony to plug a light bulb of your choosing into your light socket, until Philips, cowed by the outrage, rolled that back. But who, faced with this opportunity to add restriction to their technology that then allows them to dictate how people can use it, to make it as profitable as possible for them, who wouldn’t choose to adopt DRM? What industry wouldn’t thank the government for that gift and take it on board?

So now we see digital rights management in technology as diverse as pacemakers, thermostats, cars, medical diagnostic equipment, baby monitors, insulin pumps, cat litter boxes, smart light bulbs, and these babies, the Internet of Things rectal thermometer. You literally have digital rights management up the ass at this point.

The privacy and security implications of all of these devices being off-limits to investigation, to security auditing, and to disclosure… That’s figuratively thermonuclear, but it’s literally potentially lethal for you not to be able to know how these systems are working and whether or not they have flaws in them.

Now, at a certain moment, because these problems become so visible to us, we hit a kind of moment of peak indifference: the moment after which the number of people who care about this stuff is never going to go down. That’s not the moment at which the tide changes in the policy debate, but it is the moment at which the activist tactic changes. Because although your job may have been for twenty years to convince people that this stuff mattered, all of a sudden your job becomes convincing people that there’s something they can do about it, because people have agreed that this stuff matters.

So, when enough of us have watched a loved one die of lung cancer, when climate refugees can no longer be ignored because they’re literally washing up on your shore, when data breaches destroy the lives of millions of people every week, things change. For example, the Office of Personnel Management leaked over twenty million records of people who had applied for security clearance in the United States a little over a year ago. Suddenly the privacy debate in those circles changed.

I went to this RAND war game exercise about information breaches, and all of the cops and spooks in the room, every time someone proposed a solution that involved allowing lots of information into the public domain and allowing it to be handled by unvetted parties, rejected those solutions out of hand, as though they were completely unfit for purpose. And I couldn’t figure out why until one of them said “Office of Personnel Management,” and I said, “Oh yeah, right. You had to sit down with a government official and tell them everything that could be used to blackmail you as a condition of your security clearance. So they know about your mom’s suicide attempt and the fact that your brother is in the closet and the fact that you’re HIV-positive and you haven’t disclosed it to your coworkers. And all of that information was breached, probably to the Chinese government, last year. Of course you now care about privacy.”

And they’re not the only ones. It’s not just people whose data gets leaked this way. People have their devices breached by voyeurs who spy on them. You may have heard of remote access trojans, or RATing. This is when you break into someone’s laptop, spy on them using their laptop camera, capture images of their incidental nudity along with their keystrokes when they key in their passwords for social media, and then combine those two things: “I will publish these incidental nude images on your social media channels,” to blackmail them into performing live sex acts on camera.

When RATers get arrested, they don’t just have one or two victims. They often have hundreds. The FBI raided a hundred RATers last year. The most prolific had four hundred victims, many of them underage, all over the world. There was a very publicized case in Canada where a young woman committed suicide after being hounded by a RATer. She was a teenager, and the RATer was releasing her information into her social media channel, where her fellow students could see it, which led to her being bullied and killing herself.

So at a certain point, people find themselves unable to ignore these problems anymore. They find that their cars are being hijacked by networked attacks, the disclosure of which can lead to felony prosecution. Or that enough information has been pieced together through data breaches to get a duplicate deed for their house and then sell their house out from under them while they’re out of town. This happened both in New York and London in the run-up to Christmas last year.

Or when people realize that it’s a felony to reconfigure their devices to do what they want, or to get the maximum value out of them. Farmers are all up in arms about John Deere, which uses digital rights management to lock up the diagnostic information on the tractor, including the information that’s generated when you drive your tractor around the back forty and collect soil density data at a fine degree of resolution, data which you could then use to broadcast your seed automatically. Except John Deere won’t give you that information. They sell it to you along with a bundle of seed from partners like Monsanto. And to take that information off the tractor on your own, without their say-so, risks DMCA prosecution. And so farm magazines are now worried about the DMCA and about DRM.

So, at that moment when everybody is suddenly caring about this stuff, that’s the moment at which nihilism can be averted. It’s the moment in which nihilism must be averted if you’re going to make a change. Peak indifference is the moment when you stop convincing people to care about an issue and start convincing them to do something about it. To quit smoking, to call for emissions reduction, to install crypto on their devices, to jailbreak everything. It’s the moment when you tell them the names of the people who personally benefited from their immiseration, and you tell them where they live. The people who deliberately created this false controversy that made it impossible to effectively address these problems. That’s the moment when, if you catch it, you can move people from indifference to making a difference.

But you need principles if you’re going to make it happen. As the esteemed computer scientist Alexander Hamilton once said, “If you stand for nothing, what will you fall for?” Just because some rules are bad, it doesn’t follow that rules themselves are bad. You need to have principles that guide your work, a way to defend them against everyone, including future versions of yourself who might someday weaken or waver in your commitment to those principles, and a way to keep those principles up to date.

So, we have a really good example of this in our community. It’s the GNU/Linux licensing regime, the GPL, which I’m sure you’re all familiar with. The free software movement has these principles, this principle that computers should serve people rather than enslaving them. And it has, as a way of implementing them, these three ideas (actually four ideas; there’s a zero in there): that you should be able to run code; that you should be able to understand your code; that you should be able to improve your code; and that you should be able to share what you’ve learned in improving your code with other people.

And it has a tactic for making that stick: the GPL, their software license. Once you license code under the GPL, there’s no backsies. You can’t revoke that license. Which means that if you start your business full of high-minded ideals about how you’re going to change the world by opening up computers and software, then no matter how desperate things become; no matter how many people’s mortgages are on the line because you can’t make payroll; no matter how your acquisition suitor wants you to change things; no matter how angry your investors get at you, you can never de-GPL your code. And in fact, if you make it known that your code can never be de-GPL’d, there’s a good chance no one’s ever going to ask you to de-GPL it. It actually changes the characteristics of what people pressurize you to do.
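In practice, the pact is entered simply by shipping each source file with the license grant; once a recipient has the code under those terms, the grant can’t be clawed back from them. Here’s a minimal example of the conventional per-file header (the name, year, and address are placeholders, and the notice is abridged from the FSF’s recommended wording):

```python
# SPDX-License-Identifier: GPL-3.0-or-later
# Copyright (C) 2016  A. Hacker <hacker@example.org>
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
# See <https://www.gnu.org/licenses/> for the full text.
```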

There’s a name for this. It’s called the “Ulysses pact.” This is named after Ulysses, who lashed himself to the mast when he was going into siren-infested waters, to make sure that when the sirens sang he wouldn’t jump into the sea. When you license your code under the GPL on day one, you’re doing something equivalent to throwing away your Oreos on day one of your diet. You do that not because you’re weak-willed and won’t be able to resist the siren song of Oreos, but because you are strong-willed enough to know that there will come a day when your will will waver. And so you bind yourself now to not taking a bad course of action in the future.

Now, the early pioneers of the Internet wanted to build something decentralized, open, and free. But as we’ve learned, we ended up building history’s biggest surveillance device, so advanced that governments facing social unrest sometimes leave the Internet on instead of turning it off, because the best way to control a population in revolt is to know exactly what they’re doing through their devices.

And no one is the villain of their own story. The net pioneers who made the compromises that made the Internet what it is today didn’t decide to sell out; they made a tiny compromise. And because we’re only really capable of detecting relative differences, they made another little compromise, and another little compromise, each one of which felt very small. But we ended up where we are today.

And there’s this great project under way today to re-decentralize the Internet. But if we’re going to do that, we also need to figure out how to prevent it from de-re-decentralizing. We need rules to guard us from our future selves and the moments of weakness that we’ll have. The rules that we make today, when we’re pirates, to guard us from the admirals that some of us will inevitably become.

So I propose a couple of rules. The first one is that computers should always obey their owners. When a computer gets a message from a remote party that contradicts what the person who owns the computer wants it to do, the person should always win. And the second one is that true facts about computers should always be legal to disclose, especially about their security vulnerabilities.

So how do we make those stick? Well, we can build them into our license terms and conditions of membership in our consortia. We can make them conditions of regulatory approval. We can say that if the FDA’s going to bless your medical implant, they have to bind the company that makes it not to invoke laws that prevent disclosure or owner override over their devices. We can incorporate them into the definition of open standards.

Your own rule-breaking needs to have principles like these, these simple, minimum viable agreements. The rules for rule-breaking. The principles so hardline that they call you an extremist. In fact, if they’re not calling you an extremist, you’re probably not doing it right. And you will need Ulysses pacts. You’ll need tools to stop you from becoming compromised when you get old and tired.

The werewolf’s sin is not turning into a werewolf; it’s failing to lock himself in the basement when the full moon comes. Your trick will not be to stay pure. Your trick will be to anticipate the moments of weakness in the future and to make sure that you can guard yourself against them.

I think Kit has an announcement to make. So, as I was coming up, I was wondering whether or not I’d be able to say this, but I think Kit can say this.

Kit Walsh: I’m Kit Walsh. I’m a staff attorney at EFF. And I would like to announce that one of the laws that Cory mentioned, Section 1201 of the Digital Millennium Copyright Act, prevents security research; it prevents you from accessing media in order to remix it, recast it. We have principles that govern what sorts of rules we can have, and they’re in the Constitution. And this morning, we filed a case against the government challenging the constitutionality of Section 1201 of the DMCA as contrary to the First Amendment. We brought that case on behalf of security researcher Matt Green and technologist Bunnie Huang and his company AlphaMax, but really on behalf of the entire public and everyone who wants to make lawful uses of copyrighted works for research and expression. Thanks.

Doctorow: Thank you. So, that’s our next several years taken care of. We’re going to be working on changing the law. And not just changing it here: because all of the countries around the world have been arm-twisted into adopting their own versions of Section 1201 of the DMCA, things will be ripe for our allies and colleagues around the world to think about revoking it there, too. EFF has a project called the Apollo 1201 Project, whose goal is to end all DRM in the world within a decade, and this is our opening salvo. Thank you all very much.

