Cory Doctorow: Hi, I’m Cory Doctorow. I write science fiction novels for young people and I work for the Electronic Frontier Foundation. It’s a pleasure to be back here at DEF CON Kids at r00tz. And the talk today, it’s called “You are Not a Digital Native.”

So, you may have heard people come up to you and say like, “Hey, you’re young. That makes you a digital native.” Something about being born after the millennium or born after 1995 or whatever, that makes you sort of mystically tuned in to what the Internet is for, and anything that you do on the Internet must be what the Internet is actually for. And I’m here to tell you that you’re not a digital native. That you’re just someone who uses computers, and you’re no better and no worse than the rest of us at using computers. You make some good decisions and you make some bad decisions. But there’s one respect in which your use of computers is different from everyone else’s. And it’s that you are going to have to put up with the consequences of those uses of computers for a lot longer than the rest of us, because we’ll all be dead.

So there’s an amazing researcher named danah boyd. She studies how young people use the Internet and computer networks. She’s been doing it for about twenty years. She was the first anthropologist to work with a big tech company. She worked at Intel, at Google, at a company called Friendster that’s, like, no longer extant. She worked at Facebook. She worked at Twitter. And one of the things that she sees over and over again is that young people care a whole lot about their privacy, they just don’t know who they should be private from, right.

They spend a lot of time worried about being private from other kids they go to school with. From their teachers. From their principals. But they don’t spend a lot of time wondering about how private they need to be from, like, the government in twenty years looking backwards to figure out who their friends are and what they were doing when they were in school and whether or not they’re the wrong kind of person. They don’t spend a lot of time worrying about what a future employer might find out about them. But they spend a lot of time worrying about what other people might find out about them, and they go to enormous lengths to protect their privacy.

So danah, she documents kids who are hardcore Facebook users. And what they do is every time they get up from Facebook, every time they leave their computer, they resign from Facebook.
And Facebook lets you resign and then keeps your account open in the background for up to six weeks in case you change your mind. And if you come back and you reactivate your account, you get all your friends back, you get all your posts back, and everyone can see you again. But while you’re resigned no one can see you on Facebook, no one can comment on you, and no one can read your stuff. So this is a way of them controlling their data, and that’s what privacy is. Privacy isn’t about no one ever knowing your business. Privacy’s about you deciding who gets to know your business. And so young people really do care about their privacy, but sometimes they make bad choices.

Now, that matters. Because privacy is a really big part of how we progress as a society. You know, there are people alive today who can remember when things that today are considered very good and important were at one point considered illegal and even something you could go to jail for.

So one of the great heroes of computer science, a guy named Alan Turing, the guy who invented modern cryptography and helped invent the modern computer, he went to jail and then was given a kind of bioweapon, hormone injections, until he killed himself, because he was gay. So that was in living memory. There are people alive today who remember when that happened to Alan Turing. Today if you’re gay you can be in the armed forces, you can get married, you can run for office, you can serve all over the world.

The way that we got from “it was illegal to be gay, and even if you were a war hero they would hound you to death if you were gay,” to “it’s totally cool to be gay and you can marry the people you love,” is by people having privacy. And that privacy allowed them to choose the time and manner in which they talked to the people around them about being gay. They could say, “You know, I think that you and I are ready to have this conversation,” and when they had that conversation, they could enlist people to their side. And one conversation at a time, they made this progress that allowed our society to come forward.

So unless you think that, like, now in 2017 we’ve solved all of our social problems and we’re not going to have to make any changes after this, then you have to think that, like, at some point in your life people that you love are going to come forward and say to you, “There’s a secret that I’ve kept from you, something really important to me. And I want you to help me make that not a secret anymore. I want you to help me change the world so that I can be my true self.” And without a private realm in which that conversation can take place, those people will never be able to see the progress that they need. And that means that people that you love, people that are important to you, those people will go to their graves with a secret in their heart that caused them great sorrow, because they could never talk to you about it.

So, we talk about whether or not kids are capable of making good decisions or bad decisions, whether effectively kids are dumb or smart. And I think the right way to think about this is not whether kids are dumb or smart, but what kids know and what they don’t know. Because kids can be super duper smart. Kids can be amazing reasoners. But kids don’t have a lot of context. They haven’t been around long enough to learn a bunch of the stuff that goes on in the world and use that as fuel for their reasoning. That’s why you hear about children who are amazing chess players or physicists or mathematicians. Because the rules that you need to know to be an amazing chess player you can absorb in an hour. And then if you have a first-rate intellect you can take those rules and turn them into amazing accomplishments.

But you’ve never heard of a kid who is a prodigy of history, or law. Because it just takes years and years to read the books you need to know to become that kind of amazing historian or legal scholar. And so when kids make choices about privacy, it’s not that they don’t value their privacy, it’s that they lack the context to make the right choices about privacy sometimes. And over time they get that context and they apply their reasoning, and then they become better and better at privacy.

But there are things that work against that privacy, against our ability to learn the context of privacy. The first is that our private decisions are separated by a lot of time and space from their consequences. So imagine trying to get better at kicking a football or hitting a baseball, and every time someone throws the ball you swing the bat. But before you see where the ball goes you close your eyes, you go home. And a year later someone tells you whether or not the ball connected with the bat and where it went. You would never become a better batter, because the feedback loop is broken.

Well, when you make a privacy disclosure, it might be months or years or decades before that privacy disclosure comes back to bite you in the butt. And by that time it’s too late to learn lessons from it.

And if that wasn’t bad enough, there are a whole lot of people who wish that you wouldn’t have privacy intuition. Who hope that you won’t be good at making good privacy choices. And those people often are the ones who are talking about digital natives, who want you to believe that whatever dumb privacy choice you made last week you made because you have the mystical knowledge of how the Internet should be used, and not because you made a dumb mistake.

And those people, they tend to say things like, “Privacy is not a norm in the 21st century. If you look around you’ll see that young people don’t really care about their privacy.” What they actually mean is, “I would be a lot richer if you didn’t care about your privacy. My surveillance platform would be a lot more effective if you didn’t take any countermeasures to protect your data.” So it’s really hard to get good at privacy, partly because it’s intrinsically hard, and partly because we have a lot of people who use self-serving BS to try and confuse you about whether or not privacy is good or bad.

Ultimately, privacy is a team sport. Even if you take on board lots of good privacy choices, even if you use private messengers, even if you use private email tools like GPG and PGP, even if you use full-disk encryption, the people that you communicate with, unless they’re also using good privacy tools, all of the data that you share with them is just hanging around out there with no encryption and no protection.

And so you have to bring those people on board. And that’s where the fact that young people do care about privacy, even if they don’t know exactly who they need to be private from, that’s where this comes in. Because young people’s natural desire to have a space in which they can conduct their lives without their parents, without their teachers, without fellow kids looking in on them, that’s an opportunity for you to enlist them into being more private, into having better control over who can see their data.

So I’m going to leave you with a recommendation for where to look for those tools. Electronic Frontier Foundation, for whom I work, we have a thing called the Surveillance Self-Defense Kit, ssd.eff.org. And it’s broken up into playlists depending on what you’re worried about. Say you’re a journalist in a war zone, or maybe you’re a kid in high school, we’ve got different versions of the self-defense kit for you.

It’s in eleven languages, so you can get your friends— If you come from another country, you can get friends in other countries to come along with you. And that way you can all play the privacy team sport, and you can use your youthful curiosity to become better consumers of privacy and better practitioners of privacy. And then the Internet that you inherit, that we will be long dead and you’ll still be living with, that Internet will be a better Internet for you. Thank you very much.


Oh hey. So, I just found out we’ve got free t‑shirts from the Electronic Frontier Foundation. They have these awesome designs. They have all the best things from the Internet. There’s a cat. There’s books. And there’s CCTV cameras. Which are all the things the Internet has. If anyone has a question, the thing I will redeem the question for, in addition to an answer, is a t‑shirt. So, anyone got a question? Yes. Oh, they’re smalls. But you may have a friend who could wear it. Or you could wear it as a hat. Go.

[Audience 1 question inaudible]

Cory Doctorow: How legal is it to decrypt Bluetooth packets? Well you know, r00tz has this amazing code of ethics, that you can see over here, that’s about what you should and shouldn’t do. And one of those things is you should be messing with your own data but not other people’s data. If you’re messing with your own data, if you’re looking at a tool that you own, like a Bluetooth card that you own, and analyzing it, then in general it’s pretty legal. But let me tell you, there are a couple of ways in which this can get you into trouble. And the reason I’m mentioning it is not to scare you but to let you know that EFF has your back.

So there’s a law called the Computer Fraud and Abuse Act, CFAA, of 1986. And under CFAA… Let me tell you a story of how CFAA came to exist. You ever see a movie called WarGames, with Matthew Broderick? So after WarGames came out, people went like, “Oh my god, teenagers are going to start World War III. Congress has to do something.” So Congress passed this unbelievably stupid law called the Computer Fraud and Abuse Act in 1986. And what they said is if you use a computer in a way that you don’t have authorization for, it’s a felony, we can put you in jail. And so that has now been interpreted to say if you violate terms of service, which is the authorization you have to use some web site, you are exceeding your authorization.

But like, every one of us violates terms of service all day long. If you’ve ever read terms of service, what they say is, “By being dumb enough to be my customer, you agree that I’m allowed to come over to your house and punch your grandmother, wear your underwear, make long-distance calls, and eat all the food in your fridge,” right? And every one of us violates those rules all day long. And when prosecutors decide they don’t like you, they can invoke the Computer Fraud and Abuse Act.

Now, we have defended many clients who’ve been prosecuted under the Computer Fraud and Abuse Act. And we will continue to do so. We are agitating for a law called Aaron’s Law. It’s named after a kid named Aaron Swartz, who was a good friend of ours. He was one of the founders of Reddit. Was an amazing kid. Helped invent RSS when he was 12 years old. And in 2013, after being charged with thirteen felonies and facing a thirty-five-year prison sentence for violating the terms of service on MIT’s network and downloading academic articles, he killed himself. So we’ve been fighting in his name to try and reform this law ever since.

We’re also involved in fighting to reform a law called the Digital Millennium Copyright Act, or DMCA. And the DMCA has this section called Section 1201. And it says that if there’s a system that protects a copyrighted work, tampering with that system is also a felony that can put you in jail for five years and see you with a $500,000 fine for a first offense. And one year and two weeks ago, we filed a lawsuit against the US government to get rid of that law, too. Because it’s being used to stop people from repairing their own cars, figuring out how their computers work, jailbreaking their phones, doing all kinds of normal things that you would expect to be able to do with your property.

Now, in general, what you do with your Bluetooth device? That’s your own business. But laws like the Computer Fraud and Abuse Act and the Digital Millennium Copyright Act, they get in the way of that. And that’s one of the reasons EFF is so involved in trying to kill them. Thanks for your question. Here, take some textiles.

Yeah, go ahead. And I’m going to come down so I can hear you.

[Audience 2 question inaudible]

So the question was what’s my favorite science fiction subject to write about? And so I’m one of a small but growing number of science fiction writers who write what I call “techno-realism.” So, usually in science fiction, for like fifty years, maybe even longer, whenever a science fiction writer wanted to do something in the plot, they’d just stick a computer in and say, “Oh, computers are able to do this thing, or computers aren’t able to do this thing,” to make the plot go forward, without any actual reference to what computers could do.

And you know, I think that, like, sixty years ago, when computers were the size of buildings and like eleven people in the world had ever used one, that was an easier kind of magic trick to play without anyone figuring out where you hid the card. But today I think a lot of people know how computers work. And so I like to write stories in which the actual capabilities and constraints of computers are the things that the plots turn on. And so you know, I’ve got books like Little Brother that are ten years old that people still read as contemporary futuristic science fiction, even though I wrote it twelve years ago, right. And that’s because the things that computers can and can’t do, theoretically, those aren’t going to change. Maybe, like, how we use them will change, but the underlying computer science theory is pretty sound. So that’s how I like to work in my science fiction.

Alright one over here, yes.

[Audience 3 question inaudible]

So the question was how do you know when it’s safe to share your information, when it’s a good idea? Well you know, that’s really hard. Like, I make bad decisions about that all the time. There’ve been, like, many times I’ve gone and made a choice to give some data to a web site or whatever, and then I find out that that web site— Like Yahoo!, right? Yahoo! was compromised for five years before they told anyone that they leaked everyone’s data. And so it’s really hard to know for sure when to do that. I think that some of it is just trying to remember what you thought might happen when you made a disclosure, and then comparing it to what happened.

And some of it, I think, though, is that we need companies to be more careful with our data. And one of the ways to do that is to hold them to account when they make bad choices. So right now, a company that breaches all your data generally doesn’t owe you any money. Like, Home Depot breached 80 million customers’ credit card records. And they had to give each of them about thirty-three cents and a six-month gift certificate for credit monitoring services. So like, if those companies actually had to pay what that breach would cost their customers, then I think those companies would be a lot more careful with their data. So some of it is us getting better at making the choice, but some of it is the companies having to pay the cost for being part of that stupid choice, right. Thank you for your excellent question.

Alright, we’ve got two more t‑shirts.

[Audience 4 question inaudible]

So, the question was how can other people get involved in fighting all these fights? So if you’re on a university campus, Electronic Frontier Foundation has this network of campus organizations. If you email info@eff.org and say that you’d like to get your campus involved, that’s a thing you can do.

Joining EFF’s mailing list is really useful. We will send you targeted notices when your lawmaker needs a phone call from their voters saying, “Get out there and make a good vote.”

Obviously, like, you know, writing EFF a check is a nice thing to do. We’re a member-supported nonprofit. We get our money mostly from small-money donors like you.

Making good privacy choices yourself, and then helping other people around you make those choices. Having this conversation. You know, like, people say, “Oh, my mom will never figure out how to use this stuff.” And you know, first of all it gives moms a super bad rap. Because people design technology for the convenience of bosses. Like, if a technology can’t be used easily by, like, someone’s boss? That person gets in trouble. If a technology can’t be used easily by someone’s mom, they’re like, “Oh, my mom’s dumb,” right? And so, like, moms have to work like a hundred times harder than bosses. And so they’re like ninjas compared to bosses. So bosses suck at using technology, not moms.

But there are people in your life, the kind of people who might come to DEF CON, who haven’t really thought this stuff through. If you had this conversation with two people in the next month, you would triple the number of people in your circle who care about this stuff. Thank you. Have a t‑shirt.

Who’s got the last question? It has to be really good, but no pressure.

[Audience 5 question inaudible]

So the question was he just got here, can I repeat the entire speech all over again? I can. But it’s just the hash of the speech. So it’s really short. It’s like 0x117973798777. So you can just compare that to the recording—make sure that it hasn’t been tampered with. Alright.
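The hash joke points at a real technique: you can publish a cryptographic fingerprint of a recording, and later anyone can recompute it to check the recording hasn’t been tampered with. A minimal sketch in Python using SHA-256 (the bytes standing in for the recording are made up for illustration):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of some bytes, e.g. an audio recording."""
    return hashlib.sha256(data).hexdigest()

# A tiny stand-in for a recording of the talk.
recording = b"You are not a digital native."
published_hash = fingerprint(recording)

# Later, anyone can recompute the hash and compare it to the published one.
print(fingerprint(recording) == published_hash)                 # True: untampered
print(fingerprint(recording + b" (edited)") == published_hash)  # False: tampered
```

Even a one-byte change to the input produces a completely different digest, which is why comparing hashes detects tampering.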

[Audience 6 question inaudible]

So the question was are any of my stories going to be made into movies. I’ve had a ton of stuff optioned. I’ve got a thing under option at Paramount right now, Little Brother. There’s a Bollywood studio that has an option on my book For the Win. But most stuff that gets optioned doesn’t get made. That’s like a pull process, not a push process. The author doesn’t get to show up and say, “Now you make my movie!” It’s more like someone comes along and says, “Maybe we’ll make your movie,” and you’re like, “Okay, give me some money,” and then you wait. So I’m just waiting. Thank you.

Alright, thank you all very much. Have a great r00tz.