Cory Doctorow: Hi, I'm Cory Doctorow. I write science fiction novels for young people and I work for the Electronic Frontier Foundation. It's a pleasure to be back here at DEF CON Kids at r00tz. And the talk today, it's called "You are Not a Digital Native."

So, you may have heard people come up to you and say like, "Hey, you're young. That makes you a digital native." Something about being born after the millennium or born after 1995 or whatever, that makes you sort of mystically tuned in to what the Internet is for, and anything that you do on the Internet must be what the Internet is actually for. And I'm here to tell you that you're not a digital native. That you're just someone who uses computers, and you're no better and no worse than the rest of us at using computers. You make some good decisions and you make some bad decisions. But there's one respect in which your use of computers is different from everyone else's. And it's that you are going to have to put up with the consequences of those uses of computers for a lot longer than the rest of us, because we'll all be dead.

So there's an amazing researcher named danah boyd. She studies how young people use the Internet and computer networks. She's been doing it for about twenty years. She was the first anthropologist to work with a big tech company. She worked at Intel, at Google, at a company called Friendster that's like, no longer extant. She worked at Facebook. She worked at Twitter. And one of the things that she sees over and over again is that young people care a whole lot about their privacy, they just don't know who they should be private from, right.

They spend a lot of time worried about being private from other kids they go to school with. From their teachers. From their principals. But they don't spend a lot of time wondering about how private they need to be from like, the government in twenty years looking backwards to figure out who their friends are and what they were doing when they were in school and whether or not they're the wrong kind of person. They don't spend a lot of time worrying about what a future employer might find out about them. But they spend a lot of time worrying about what other people might find out about them, and they go to enormous lengths to protect their privacy.

So danah, she documents kids who are hardcore Facebook users. And what they do is every time they get up from Facebook, every time they leave their computer, they resign from Facebook.
And Facebook lets you resign and then keeps your account open in the background for up to six weeks in case you change your mind. And if you come back and you reactivate your account, you get all your friends back, you get all your posts back, and everyone can see you again. But while you're resigned no one can see you on Facebook, no one can comment on you, and no one can read your stuff. So this is a way of them controlling their data, and that's what privacy is. Privacy isn't about no one ever knowing your business. Privacy's about you deciding who gets to know your business. And so young people really do care about their privacy, but sometimes they make bad choices.

Now, that matters. Because privacy is a really big part of how we progress as a society. You know, there are people alive today who can remember when things that today are considered very good and important were at one point considered illegal and even something you could go to jail for.

So one of the great heroes of computer science, a guy named Alan Turing, the guy who invented modern cryptography and helped invent the modern computer, he went to jail and then was given a kind of bioweapon, hormone injections, until he killed himself because he was gay. So that was in living memory. There are people alive today who remember when that happened to Alan Turing. Today if you're gay you can be in the armed forces, you can get married, you can run for office, you can serve all over the world.

The way that we got from it was illegal to be gay and even if you were a war hero they would hound you to death if you were gay, to it's totally cool to be gay and you can marry the people you love, is by people having privacy. And that privacy allowed them to choose the time and manner in which they talked to the people around them about being gay. They could say, "You know, I think that you and I are ready to have this conversation," and when they had that conversation, they could enlist people to their side. And one conversation at a time, they made this progress that allowed our society to come forward.

So unless you think that like now in 2017 we've solved all of our social problems and we're not going to have to make any changes after this, then you have to think that like, at some point in your life people that you love are going to come forward and say to you, "There's a secret that I've kept from you, something really important to me. And I want you to help me make that not a secret anymore. I want you to help me change the world so that I can be my true self." And without a private realm in which that conversation can take place, those people will never be able to see the progress that they need. And that means that people that you love, people that are important to you, those people will go to their graves with a secret in their heart that caused them great sorrow because they could never talk to you about it.

So, we talk about whether or not kids are capable of making good decisions or bad decisions, whether effectively kids are dumb or smart. And I think the right way to think about this is not whether kids are dumb or smart, but what kids know and what they don't know. Because kids can be super duper smart. Kids can be amazing reasoners. But kids don't have a lot of context. They haven't been around long enough to learn a bunch of the stuff that goes on in the world and use that as fuel for their reasoning. That's why you hear about children who are amazing chess players or physicists or mathematicians. Because the rules that you need to know to be an amazing chess player you can absorb in an hour. And then if you have a first-rate intellect you can take those rules and turn them into amazing accomplishments.

But you've never heard of a kid who is a prodigy of history, or law. Because it just takes years and years to read the books you need to know to become that kind of amazing historian or legal scholar. And so when kids make choices about privacy, it's not that they don't value their privacy, it's that they lack the context to make the right choices about privacy sometimes. And over time they get that context and they apply their reasoning, and then they become better and better at privacy.

But there are a couple of things that work against that privacy, against our ability to learn the context of privacy. The first is that our private decisions are separated by a lot of time and space from their consequences. So imagine trying to get better at kicking a football or hitting a baseball, and every time someone throws the ball you swing the bat. But before you see where the ball goes you close your eyes, you go home. And a year later someone tells you whether or not the ball connected with the bat and where it went. You would never become a better batter because the feedback loop is broken.

Well, when you make a privacy disclosure, it might be months or years or decades before that privacy disclosure comes back to bite you in the butt. And by that time it's too late to learn lessons from it.

And if that wasn't bad enough, there are a whole lot of people who wish that you wouldn't have privacy intuition. Who hope that you won't be good at making good privacy choices. And those people often are the ones who are talking about digital natives, who want you to believe that whatever dumb privacy choice you made last week you made because you have the mystical knowledge of how the Internet should be used and not because you made a dumb mistake.

And those people, they tend to say things like, "Privacy is not a norm in the 21st century. If you look around you'll see that young people don't really care about their privacy." What they actually mean is, "I would be a lot richer if you didn't care about your privacy. My surveillance platform would be a lot more effective if you didn't take any countermeasures to protect your data." So it's really hard to get good at privacy, partly because it's intrinsically hard, and partly because we have a lot of people who use self-serving BS to try and confuse you about whether or not privacy is good or bad.

Ultimately, privacy is a team sport. Even if you take on board lots of good privacy choices, even if you use private messengers, even if you use private email tools like GPG and PGP, even if you use full-disk encryption, the people that you communicate with, unless they're also using good privacy tools, all of the data that you share with them, it's just hanging around out there with no encryption and no protection.
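That team-sport point is easy to see in code. The sketch below is an editor's illustration, not part of the talk: it assumes GnuPG and the python-gnupg package are installed, and "friend@example.com" is a hypothetical correspondent. Encrypting to someone only works if that person has made a key of their own, which is exactly why you have to bring other people on board.

```python
# Minimal sketch (not from the talk): encrypting a note to a friend with GnuPG.
# Assumes GnuPG and the python-gnupg package are installed;
# "friend@example.com" is a hypothetical correspondent whose public key
# may or may not be in your keyring.
import gnupg

gpg = gnupg.GPG()  # uses your default GnuPG home directory and keyring

message = "meet me after school"
result = gpg.encrypt(message, ["friend@example.com"])

if result.ok:
    # ASCII-armored ciphertext: safe to send over any channel.
    print(str(result))
else:
    # No public key for the friend means no encryption: privacy is a team
    # sport, and it fails when the other side isn't playing.
    print("Couldn't encrypt:", result.status)
```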

And so you have to bring those people on board. And that's where the fact that young people do care about privacy even if they don't know exactly who they need to be private from, that's where this comes in. Because young people's natural desire to have a space in which they can conduct their lives without their parents, without their teachers, without fellow kids looking in on them, that's an opportunity for you to enlist them into being more private, into having better control over who can see their data.

So I'm going to leave you with a recommendation for where to look for those tools. Electronic Frontier Foundation, for whom I work, we have a thing called the Surveillance Self-Defense Kit, ssd.eff.org. And it's broken up into playlists depending on what you're worried about. Say you're a journalist in a war zone, or maybe you're a kid in high school, we've got different versions of the self-defense kit for you.

It's in eleven languages, so you can get your friends— If you come from another country, you can get friends in other countries to come along with you. And that way you can all play the privacy team sport, and you can use your youthful curiosity to become better consumers of privacy and better practitioners of privacy. And then the Internet that you inherit, that we will be long dead and you'll still be living with, that Internet will be a better Internet for you. Thank you very much.


Discussion

Oh hey. So, I just found out we've got free t-shirts from the Electronic Frontier Foundation. They have these awesome designs. They have all the best things from the Internet. There's a cat. There's books. And there's CCTV cameras. Which are all the things the Internet has. If anyone has a question, the thing I will redeem the question for in addition to an answer is a t-shirt. So, anyone got a question? Yes. Oh, they're smalls. But you may have a friend who could wear it. Or you could wear it as a hat. Go.

[Audience 1 question inaudible]

Cory Doctorow: How legal is it to decrypt Bluetooth packets? Well you know, r00tz has this amazing code of ethics, that you can see over here, that's about what you should and shouldn't do. And one of those things is you should be messing with your own data but not other people's data. If you're messing with your own data, if you're looking at a tool that you own like a Bluetooth card that you own and analyzing, then in general it's pretty legal. But let me tell you, there are a couple of ways in which this can get you into trouble. And the reason I'm mentioning it is not to scare you but to let you know that EFF has your back.

So there's a law called the Computer Fraud and Abuse Act, CFAA of 1986. And under CFAA… Let me tell you a story of how CFAA came to exist. You ever see a movie called WarGames with Matthew Broderick? So after WarGames came out, people went like, "Oh my god, teenagers are going to start World War III. Congress has to do something." So Congress passed this unbelievably stupid law called the Computer Fraud and Abuse Act in 1986. And what they said is if you use the computer in a way that you don't have authorization for, it's a felony, we can put you in jail. And so that has now been interpreted to say if you violate terms of service, which is the authorization you have to use some web site, you are exceeding your authorization.

But like, every one of us violates terms of service all day long. If you've ever read terms of service, what they say is, "By being dumb enough to be my customer, you agree that I'm allowed to come over to your house and punch your grandmother, wear your underwear, make long-distance calls, and eat all the food in your fridge, right?" And every one of us violates those rules all day long. And when prosecutors decide they don't like you, they can invoke the Computer Fraud and Abuse Act.

Now, we have defended many clients who've been prosecuted under the Computer Fraud and Abuse Act. And we will continue to do so. We are agitating for a law called Aaron's Law. It's named after a kid named Aaron Swartz who was a good friend of ours. He was one of the founders of Reddit. Was an amazing kid. Helped invent RSS when he was 12 years old. And in 2013, after being charged with thirteen felonies and facing a thirty-five year prison sentence for violating the terms of service on MIT's network and downloading academic articles, he killed himself. So we've been fighting in his name to try and reform this law ever since.

We're also involved in fighting to reform a law called the Digital Millennium Copyright Act, or DMCA. And the DMCA has this section called Section 1201. And it says that if there's a system that protects a copyrighted work, that tampering with that system is also a felony that can put you in jail for five years and see you with a $500,000 fine for a first offense. And one year and two weeks ago, we filed a lawsuit against the US government to get rid of that law, too. Because it's being used to stop people from repairing their own cars, figuring out how their computers work, jailbreaking their phones, doing all kinds of normal things that you would expect to be able to do with your property.

Now, in general what you do with your Bluetooth device? That's your own business. But laws like the Computer Fraud and Abuse Act and the Digital Millennium Copyright Act, they get in the way of that. And that's one of the reasons EFF is so involved in trying to kill them. Thanks for your question. Here, take some textiles.

Yeah, go ahead. And I'm going to come down so I can hear you.

[Audience 2 question inaudible]

So the question was what's my favorite science fiction question to write about? And so I'm one of a small but growing number of science fiction writers who write what I call "techno-realism?" So, usually in science fiction for like fifty years, maybe even longer, whenever a science fiction writer wanted to do something in the plot, they'd just stick a computer in and say, "Oh, computers are able to do this thing, or computers aren't able to do this thing," to make the plot go forward, without any actual reference to what computers could do.

And you know, I think that like sixty years ago when computers were the size of buildings and like eleven people in the world had ever used one, that was like an easier kind of magic trick to play without anyone figuring out where you hid the card? But today I think a lot of people know how computers work. And so I like to write stories in which the actual capabilities and constraints of computers are the things that the plots turn on. And so you know, I've got books like Little Brother that are ten years old that people still read as contemporary futuristic science fiction, even though I wrote it twelve years ago, right. And that's because the things that computers can and can't do, theoretically those aren't going to change. Maybe like, how we use them will change, but the underlying computer science theory is pretty sound. So that's how I like to work in my science fiction.

Alright one over here, yes.

[Audience 3 question inaudible]

So the question was how do you know when it's safe to share your information, when it's a good idea? Well you know, that's really hard. Like, I make bad decisions about that all the time. There's been—like, many times I've gone and made a choice to give some data to a web site or whatever, and then I find out that that web site— Like Yahoo!, right? Yahoo! was compromised for five years before they told anyone that they leaked everyone's data. And so it's really hard to know for sure when to do that. I think that some of it is just trying to remember what you thought might happen when you made a disclosure, and then comparing it to what happened.

And some of it I think, though, is that we need companies to be more careful with our data. And one of the ways to do that is to hold them to account when they make bad choices. So right now, a company that breaches all your data, generally they don't owe you any money. Like, Home Depot breached 80 million customers' credit card records. And they had to give each of them about thirty-three cents and a six-month gift certificate for credit monitoring services. So like, if those companies actually had to pay what it would cost their customers over that breach, then I think those companies would be a lot more careful with their data. So some of it is us getting better at making the choice, but some of it is the companies having to pay the cost for being part of that stupid choice, right. Thank you for your excellent question.

Alright, we've got two more t-shirts.

[Audience 4 question inaudible]

So, the question was how can other people get involved in fighting all these fights? So if you're on a university campus, Electronic Frontier Foundation has this network of campus organizations. If you email info@eff.org and say that you'd like to get your campus involved, that's a thing you can do.

Joining EFF's mailing list is really useful. We will send you targeted notices when your lawmaker needs a phone call from their voters saying "get out there and make a good vote."

Obviously like, you know, writing EFF a check is a nice thing to do. We're a member-supported nonprofit. We get our money mostly from small-money donors like you.

Making good privacy choices yourself, and then helping other people around you make those choices. Having this conversation. You know like, people say, "Oh, my mom will never figure out how to use this stuff." And you know, first of all it gives moms a super bad rap. Because people design technology for the convenience of bosses. Like, if a technology can't be used easily by like someone's boss? That person gets in trouble. If a technology can't be used easily by someone's mom, they're like, "Oh, my mom's dumb," right? And so like, moms have to work like a hundred times harder than bosses. And so they're like ninjas compared to bosses. So bosses suck at using technology, not moms.

But there are people in your life, the kind of people who might come to DEF CON, who haven't really thought this stuff through. If you had this conversation with two people in the next month, you would triple the number of people in your circle who care about this stuff. Thank you. Have a t-shirt.

Who's got the last question? It has to be really good, but no pressure.

[Audience 5 question inaudible]

So the question was he just got here, can I repeat the entire speech all over again? I can. But it's just the hash of the speech. So it's really short. It's like 0x117973798777. So you can just compare that to the recording—make sure that it hasn't been tampered with. Alright.
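(For anyone who wants to actually try that trick, here is a minimal sketch from the editor, not from the talk: hashing a recording file and comparing it to a published digest. The filename and the expected digest are hypothetical placeholders.)

```python
# Minimal sketch (not from the talk): verify a recording against a published hash.
# "talk_recording.mp4" and the expected digest are hypothetical placeholders.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so big recordings don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "0" * 64  # whatever digest the speaker published
actual = sha256_of("talk_recording.mp4")
print("Recording matches." if actual == expected else "Recording may have been altered.")
```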

[Audience 6 question inaudible]

So the question was are any of my stories going to be made into movies. I've had a ton of stuff optioned. I've got a thing under option at Paramount right now, Little Brother. There's a Bollywood studio that has an option on my book For the Win. But most stuff that gets optioned doesn't get made. That's like a pull process, not a push process. The author doesn't get to show up and say, "Now you make my movie!" It's more like someone comes along and says, "Maybe we'll make your movie," and you're like, "Okay, give me some money," and then you wait. So I'm just waiting. Thank you.

Alright, thank you all very much. Have a great r00tz.

