Good morning, everybody. Thank you so much for being here. And just before I get into talking about the conference and the logic of today, just a little bit on ground rules. This conference is entirely public. It's entirely on the record. You are already, as we speak, under video surveillance by robotic cameras in all corners, and this is so that we're able to livestream this. But we absolutely encourage you to blog, to tweet, to share in whatever way you feel like sharing. I suspect there are Pokémon to capture somewhere in this building, but if you do, make sure that you're talking about the forbidden nature of it at the same time. The hashtag for the event is #forbiddenml. And so we'd love it if you would jump on and use that tag.

Joi mentioned a little bit about how this event came about. For me, this event really started with a student of mine, Jeremy Rubin. And Jeremy, in late 2013, was part of a team that put together a new little technology called Tidbit. Tidbit was basically a system that, instead of forcing you to look at an ad when you read a web page, grabbed your browser and had you mine bitcoins. And if you happened to turn up any bitcoins, they went to the people who had created the content.

And we thought this was sort of a clever idea, a proof of concept. But it turned out that the New Jersey Attorney General really did not like this idea, and responded to it with a subpoena. And as a result, Jeremy found himself not working on innovative research but working on a pretty innovative legal defense. And many, many thanks to our friends at EFF who made that possible.

But we found, looking at this, that this raised real questions about what we as MIT, as an academic institution, do when students and members of our community find themselves in trouble because of research that they've taken on. And this isn't just about Jeremy Rubin. This is about Star Simpson. This is very much about Aaron Swartz. This is about people whose innovations find themselves pushing the limits and bumping up against legal issues: different ways of preventing people from answering questions that are deeply important to ask.

And so we threw a conference last fall called Freedom to Innovate. We were trying sort of an affirmative tack on this: how would we actually go ahead and find ways to protect this freedom, the freedom to take on novel research, and figure out how to protect ourselves from the legal barriers to it? But as we entered into that conversation, we found ourselves realizing that it's not just the legal side of things that has us bumping up against limits to forbidden research. In fact Karrie Karahalios, who's going to be speaking later today, brought up this amazing topic, this idea that we need the ability to audit algorithms to find out whether they're racially biased. And it turns out that this bumps into all sorts of legal restrictions, but also restrictions about how we're expected to use websites, how we think those sites are supposed to be used and not used.

And as we dug into this topic, we realized research gets forbidden for all sorts of reasons. We're going to talk about topics today that are forbidden in some sense because they're so big, they're so consequential, that it's extremely difficult for anyone to think about who should actually have the right to make this decision. We're going to talk about some topics that end up being off the table, that end up being forbidden, because they're kind of icky. They're really uncomfortable. And frankly, if you make it through this day without something making you uncomfortable, we did something wrong in planning this event.

But it's incredibly important that we look at these forbidden topics. We're at a particularly dark moment in the United States. We've just seen an incredible wave of violence, a wave of violence against people of color at the hands of the police, a wave of violence targeted at police. This is a really terrifying moment. It has a lot to do with gun culture and gun violence in the United States, which we cannot study as a public health issue because Congress in 1996 passed legislation that means we cannot give money to the CDC to study gun violence.

And we know that restrictions on what we can study and what we can research are restrictions on an open society, as we're seeing right now in Turkey, where Erdoğan's crackdown is focusing not only on the judiciary but also on people within universities, including university deans.

At a moment when research is forbidden, it is incredibly important that we find ways to be creatively and prosocially disobedient. And what we're going to see here is an amazing range of prosocially disobedient researchers on stage, talking about the work that they're doing and trying to figure out how to brave these restrictions, or about the work they're thinking about doing and how to do it carefully and ethically.

So I'm incredibly grateful to everyone who's come here to take the stage today. I am particularly grateful to our opening speaker, a man who, as a journalist, as a blogger, and particularly as a sci-fi author, has explored this idea of protest, of activism, of prosocial disobedience, better than almost anyone else I can think of. I'd like to welcome to the stage my friend Cory Doctorow.
