So, when I speak in places where the first language of the nation is not English, there is a disclaimer and an apology, because I am one of nature’s fast talkers. When I was at the United Nations at the World Intellectual Property Organization, I was known as the scourge of the simultaneous translation corps. I would stand up and speak and turn around, and there would be window after window of translators. And every one of them would be doing this. [places face in hand] So in advance, I give you permission when I start talking quickly to do this [waves hands in air] and I will slow down.
So, tonight’s talk is not a copyright talk. I do copyright talks all the time. Questions about culture and creativity are interesting enough, but to be honest I’m quite sick of them. If you want to hear freelance writers like me bang on about what’s happening to the way we earn our living, by all means go and find one of the many talks I’ve done on this subject on YouTube. But tonight I want to talk about something more important. I want to talk about general purpose computers. Because general purpose computers are in fact astounding. So astounding that our society is still struggling to come to grips with them. To figure out what they’re for. To figure out how to accommodate them and how to cope with them. Which, unfortunately, brings me back to copyright.
Because the general shape of the copyright wars and the lessons they can teach us about the upcoming fights over the destiny of the general purpose computer are important. In the beginning, we had packaged software and the attendant industry, and we had Sneakernet. So we had floppy disks in Ziploc bags, or in cardboard boxes, hung on pegs in shops and sold like candy bars and magazines. And they were eminently susceptible to duplication, and so they were duplicated quickly and widely, and this was to the great chagrin of people who made and sold software.
Enter DRM 0.96. They started to introduce physical defects to the disks, or started to insist on other physical indicia which the software could check for. Dongles, hidden sectors, challenge/response protocols that required that you had physical possession of large, unwieldy manuals that were difficult to copy. And of course these failed, for two reasons. First, they were commercially unpopular, because they reduced the usefulness of the software to the legitimate purchasers, while leaving the people who took the software without paying for it untouched. The legitimate purchasers resented the non-functionality of their backups. They hated the loss of scarce ports to the authentication dongles. And they resented the inconvenience of having to transport large manuals when they wanted to run their software.
And second, these didn’t stop pirates, who found it trivial to patch the software and bypass authentication. Typically, the way that happened is some expert who had possession of technology and expertise of equivalent sophistication to the software vendor itself, would reverse engineer the software and release cracked versions that quickly became widely circulated. While this kind of expertise and technology sounded highly specialized, it really wasn’t. Figuring out what recalcitrant programs were doing, and routing around the defects in shitty floppy disk media were both core skills for computer programmers, and were even more so in the era of fragile floppy disks and the rough‐and‐ready early days of software development. Anti‐copying strategies only became more fraught as networks spread. Once we had BBSes, online services, Usenet newsgroups, and mailing lists, the expertise of people who figured out how to defeat these authentication systems could be packaged up in software and passed around as little crack files, or as the network capacity increased, the cracked disk images or executables themselves could be spread on their own.
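The "crack" was rarely anything more exotic than patching the check so it always passes. Here is a minimal sketch of the idea, with a hypothetical code-wheel lookup standing in for the unwieldy printed manual; the names and values are invented for illustration:

```python
# Hypothetical "code wheel": answers a legitimate user looks up in the manual.
CODE_WHEEL = {(12, 3): "grue", (7, 1): "zork"}

def authenticate(page, word_index, answer):
    # The vendor's check: did the user consult the printed manual?
    return CODE_WHEEL.get((page, word_index)) == answer

def launch(answer_fn):
    if not authenticate(12, 3, answer_fn()):
        raise PermissionError("authentication failed: see manual")
    return "game running"

launch(lambda: "grue")  # legitimate user, manual in hand

# The crack: one patched function and the check always passes,
# much as crack files patched the compiled check in place.
authenticate = lambda *args: True
launch(lambda: "whatever")  # now runs without the manual
```

The point is not the cryptography, there isn't any; it is that the check and the thing it protects live on the same machine, under the user's control.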
Which gave us DRM 1.0. By 1996, it became clear to everyone in the halls of power that there was something important about to happen. We were about to have an information economy, whatever the hell that was. They assumed it meant an economy where we bought and sold information. Now, information technology makes things efficient, so imagine the markets that an information economy would have. You could buy a book for a day. You could sell the right to watch the movie for one euro. And then you could rent out the pause button at one penny per second. You could sell movies for one price in one country, and another price in another, and so on, and so on. The fantasies of those days were a little like a boring science fiction adaptation of the Old Testament book of Numbers, a kind of tedious enumeration of every permutation of things people do with information, and the ways we could charge them for it.
But none of this would be possible unless we could control how people use their computers and the files that we transfer to them. After all, it was well and good to talk about selling someone the twenty‐four hour right to a video, or the right to move music onto an iPod, but not the right to move music from the iPod onto another device. But how the hell could you do that once you’d given them the file?
In order to do that, to make this work, you needed to figure out how to stop computers from running certain programs and inspecting certain files and processes. For example, you could encrypt the file, and then require the user to run a program that only unlocked the file under certain circumstances. But as they say on the Internet, now you have two problems. You also now have to stop the user from saving the file while it’s in the clear, and you have to stop the user from figuring out where the unlocking program stores its keys, because if the user finds the keys she’ll just decrypt the file and throw away that stupid player app.
And now you have three problems. Because now you have to stop the users who figure out how to render the file in the clear from sharing it with other users. And now you’ve got four problems. Because now you have to stop the users who figure out how to extract secrets from unlocking programs from telling other users how to do it, too. And now you’ve got five problems. Because now you have to stop users who figure out how to extract secrets from unlocking programs from telling other users what the secrets were. That’s a lot of problems.
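That first pair of problems is easy to see in miniature. In this sketch a toy XOR cipher stands in for whatever real cipher a DRM scheme would use; the essential point survives the simplification: the player has to carry the key to do its job, so the key necessarily ships into the user's hands:

```python
# Toy XOR "cipher" standing in for a real one; key ships inside the player.
KEY = b"\x13\x37\xc0\xde"

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

cleartext = b"a licensed song"
locked = xor(cleartext, KEY)  # what the store actually delivers

def player(blob: bytes, license_ok: bool) -> bytes:
    # The approved app unlocks only "under certain circumstances".
    if not license_ok:
        raise PermissionError("license expired")
    return xor(blob, KEY)

# But the key had to ship with the player, so a user who digs it out
# can decrypt directly and throw the player away.
assert xor(locked, KEY) == cleartext
```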
But by 1996, we had a solution. We had the WIPO Copyright Treaty, passed by the United Nations World Intellectual Property Organization, which created laws that made it illegal to extract secrets from unlocking programs. And it created laws that made it illegal to extract media cleartexts from the unlocking programs while they were running. And it created laws that made it illegal to tell people how to extract secrets from unlocking programs. And it created laws that made it illegal to host copyrighted works, and secrets, and all with a handy streamlined process that let you remove stuff from the Internet without having to screw around with lawyers and judges and all that crap.
And with that, illegal copying ended forever [laughter], the information economy blossomed into a beautiful flower that brought prosperity to the whole wide world. As they say on the aircraft carriers, “Mission accomplished”. [laughter]
Well, of course that’s not how the story ends, because pretty much anyone who understood computers and networks understood that these laws would create more problems than they could possibly solve. After all, these were laws that made it illegal to look inside your computer when it was running certain programs. They made it illegal to tell people what you found when you looked inside your computer. They made it easy to censor material on the Internet without having to prove that anything wrong had happened. In short, they made unrealistic demands on reality and reality did not oblige them.
After all, copying only got easier following the passage of these laws. Copying will only ever get easier. Here, 2011, this is as hard as copying will get! Your grandchildren will turn to you around the Christmas table and say, “Tell me again, Grandpa. Tell me again, Grandma, about when it was hard to copy things in 2011. When you couldn’t get a drive the size of your fingernail that could hold every song ever recorded, every movie ever made, every word ever spoken, every picture ever taken, everything, and transfer it in such a short period of time you didn’t even notice it was doing it. Tell us again when it was so stupidly hard to copy things back in 2011.”
And so, reality asserted itself, and everyone had a good laugh over how funny our misconceptions were when we entered the 21st century and then a lasting peace was reached with freedom and prosperity for all.
Well, not really. Because, like the nursery rhyme lady who swallows a spider to catch a fly, and has to swallow a bird to catch the spider, and a cat to catch the bird, and so on, so must a regulation that has broad general appeal but is disastrous in its implementation beget a new regulation aimed at shoring up the failure of the old one.
Now, it’s tempting to stop the story here and conclude that the problem is that lawmakers are either clueless or evil, or possibly evilly clueless, and just leave it there, which is not a very satisfying place to go because it’s fundamentally a counsel of despair. It suggests that our problems cannot be solved for so long as stupidity and evilness are present in the halls of power, which is to say they will never be solved.
But I have another theory about what’s happened. It’s not that regulators don’t understand information technology. Because it should be possible to be a non‐expert and still make a good law. MPs and congressmen and so on are elected to represent districts and people, not disciplines and issues. We don’t have a member of Parliament for biochemistry, and we don’t have a senator from the great state of urban planning, and we don’t have an MEP from child welfare. But perhaps we should. And yet those people who are experts in policy and politics, not technical disciplines, nevertheless often do manage to pass good rules that make sense. And that’s because government relies on heuristics, rules of thumb about how to balance out expert input from different sides of an issue.
But information technology confounds these heuristics. It kicks the crap out of them in one important way, and this is it: one important test of whether or not a regulation is fit for purpose is first whether, of course, it will work, but second of all whether or not in the course of doing its work it will have lots of effects on everything else. If I wanted Congress or Parliament to write, or the EU to regulate, a wheel, it’s unlikely I’d succeed. If I turned up and said, “Well, everyone knows wheels are good and right. But have you noticed that every single bank robber has four wheels on his car when he drives away from the bank robbery? Can’t we do something about this?” The answer would of course be no. Because we don’t know how to make a wheel that is still generally useful for legitimate wheel applications but useless to bad guys. And we can all see that the general benefits of wheels are so profound that we’d be foolish to risk them in a doomed errand to stop bank robberies by changing wheels. Even if there were an epidemic of bank robberies, even if society were on the verge of collapse thanks to bank robberies, no one would think that wheels were the right place to start solving our problems.
But, if I were to show up in that same body to say that I had absolute proof that hands‐free phones were making cars dangerous, and I said, “I would like you to pass a law that says it’s illegal to put a hands‐free phone in a car,” the regulator might say, “Yeah, I take your point. We’ll do that.” And we might disagree about whether or not this is a good idea or whether or not my evidence made sense, but very few of us would say, “Well, once you take the hands‐free phones out of the car, they stop being cars.” We understand that we can keep cars cars even if we remove features from them. Cars are special‐purpose, at least in comparison to wheels, and all that the addition of a hands‐free phone does is add one more feature to an already specialized technology.
In fact, there’s a heuristic we can apply here. Special-purpose technologies are complex. And you can remove features from them without doing fundamental, disfiguring violence to their underlying utility. This rule of thumb serves regulators well by and large, but it is rendered null and void by the general-purpose computer and the general-purpose network, the PC and the Internet. Because if you think of computer software as a feature, that is, a computer with spreadsheets running on it has a spreadsheet feature, and one that’s running World of Warcraft has an MMORPG feature, then this heuristic leads you to think that you could reasonably say, “Make me a computer that doesn’t run spreadsheets,” and that it would be no more of an attack on computing than, “Make me a car without a hands-free phone” is an attack on cars.
And if you think of protocols and sites as features of the network, then saying, “Fix the Internet so that it doesn’t run BitTorrent,” or, “Fix the Internet so that thepiratebay.org no longer resolves,” it sounds a lot like “Change the sound of busy signals,” or, “Take that pizzeria on the corner off the phone network,” and not like an attack on the fundamental principles of internetworking.
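A sketch makes clear why name-level blocking sounds like "take the pizzeria off the phone network" but isn't. The records below are hypothetical (the addresses come from a documentation-only range): a censoring resolver can refuse to answer, but the host it pointed to is untouched:

```python
# Hypothetical zone data; 203.0.113.0/24 is reserved for documentation.
RECORDS = {"thepiratebay.org": "203.0.113.5", "example.org": "203.0.113.9"}
BLOCKLIST = {"thepiratebay.org"}

def censored_resolve(name: str) -> str:
    if name in BLOCKLIST:
        raise LookupError("NXDOMAIN")  # the resolver simply lies
    return RECORDS[name]

# The name no longer "resolves" through this resolver...
# ...but the server still exists: anyone who has the IP, or who asks
# an unfiltered resolver, reaches it exactly as before.
assert RECORDS["thepiratebay.org"] == "203.0.113.5"
```

A resolver that lies about what exists is, incidentally, precisely the failure DNSSEC was designed to detect, which is why blocking regimes and DNSSEC end up at odds.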
Not realizing that this rule of thumb that works for cars and houses and every other substantial area of technological regulation fails for the Internet does not make you evil and it does not make you an ignoramus. It just makes you part of that vast majority of the world for whom ideas like “Turing complete” and “end‐to‐end” are meaningless. So our regulators go off and they blithely pass these laws, and they become part of the reality of our technological world.
There are suddenly numbers that we aren’t allowed to write down on the Internet, programs we’re not allowed to publish. And all it takes to make legitimate material disappear from the Internet is to say, “That? That infringes copyright.” It fails to attain the actual goal of the regulation. It doesn’t stop people from violating copyright, but it bears a kind of superficial resemblance to copyright enforcement. It satisfies the security syllogism. Something must be done; I am doing something; something has been done. And thus, any failures that arise can be blamed on the idea that the regulation doesn’t go far enough rather than the idea that it was flawed from the outset.
This kind of superficial resemblance and underlying divergence happens in other engineering contexts. I have a friend who was once a senior executive at a big consumer packaged goods company who told me about what happened when the marketing department told the engineers that they’d thought up a great idea for detergent. From now on, they were going to make detergent that made your clothes newer every time you washed them.
Well, after the engineers had tried unsuccessfully to convey the concept of entropy to the marketing department, they arrived at another solution. (“Solution.”) They developed a detergent that used enzymes that attacked loose fiber ends, the kind that you get with broken fibers that make your clothes look old. So every time you washed your clothes in the detergent, they would look newer, but that was because the detergent was literally digesting your clothes. Using it would cause your clothes to dissolve in the washing machine. This was the opposite of making clothes newer. Instead, you were artificially aging your clothes every time you washed them. And as a user, the more you deployed the “solution,” the more drastic your measures had to be to keep your clothes up to date. You actually had to go buy new clothes because the old ones fell apart.
So today we have marketing departments who say things like, “We don’t need computers, we need…appliances. Make me a computer that doesn’t run every program, just a program that does this specialized task like streaming audio or routing packets or playing Xbox games. And make sure it doesn’t run programs that I haven’t authorized that might undermine our profits.” And on the surface, this seems like a reasonable idea. Just a program that does one specialized task. After all, we can put an electric motor in a blender, and we can install a motor in a dishwasher, and we don’t worry whether it’s still possible to run a dishwashing program in a blender.
But that’s not what we do when we turn a computer into an “appliance.” We’re not making a computer that runs only the “appliance” app. We’re making a computer that can run every program, but which uses some combination of rootkits, spyware, and code‐signing to prevent the user from knowing which processes are running, from installing her own software, and from terminating processes that she doesn’t want. In other words, an appliance is not a stripped‐down computer. It is a fully‐functional computer with spyware on it out of the box. [applause] Thanks.
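Concretely, the out-of-the-box gatekeeper can be sketched in a few lines. This sketch uses an HMAC with a vendor secret as a stand-in for the asymmetric signatures real code-signing schemes use, and the names are invented:

```python
import hashlib
import hmac

VENDOR_KEY = b"vendor-secret"  # real schemes use public-key signatures

def sign(code: bytes) -> bytes:
    return hmac.new(VENDOR_KEY, code, hashlib.sha256).digest()

def run_if_signed(code: bytes, signature: bytes) -> None:
    # The "appliance" refuses anything the vendor hasn't blessed.
    if not hmac.compare_digest(sign(code), signature):
        raise PermissionError("unsigned program")
    # Underneath the gate sits a fully general-purpose interpreter.
    exec(code)

approved = b"print('hello from the appliance')"
run_if_signed(approved, sign(approved))        # runs
# run_if_signed(b"print('mine')", bytes(32))   # would raise PermissionError
```

Notice what the gate does not do: it does not shrink the machine. The interpreter under it remains fully general; only the owner's say over it is removed.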
Because we don’t know how to build the general purpose computer that is capable of running any program we can compile except for some program that we don’t like, or that we prohibit by law, or that loses us money. The closest approximation we have to this is a computer with spyware. A computer on which remote parties set policies without the computer user’s knowledge, over the objection of the computer’s owner. And so it is that digital rights management always converges on malware.
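There is a classical reason behind that "we don't know how": deciding in general what a program will do runs into the halting problem. Here is a sketch of the diagonal argument, where "misbehaves" is defined, purely for illustration, as returning the string "BANNED". Hand me any claimed-perfect classifier and I can build a program it misjudges:

```python
def diagonal(classifier):
    """Given any claimed oracle for 'will this program misbehave?',
    build a program that does the opposite of the oracle's verdict."""
    def prog():
        return "OK" if classifier(prog) else "BANNED"
    return prog

# A classifier that says nothing ever misbehaves...
naive = lambda program: False
p = diagonal(naive)
assert p() == "BANNED"  # ...is wrong about p.

# One that says everything misbehaves is wrong the other way.
paranoid = lambda program: True
assert diagonal(paranoid)() == "OK"
```

No classifier survives its own diagonal program, which is why the practical substitute is never detection in advance but surveillance of what actually runs.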
There was of course this famous incident, a kind of gift to people who have this hypothesis, in which Sony loaded covert rootkit installers on six million audio CDs, which secretly executed programs that watched for attempts to read the sound files on CDs and terminated them, and which also hid the rootkit’s existence by causing the kernel to lie about which processes were running, and which files were present on the drive.
But it’s not the only example. Just recently, Nintendo shipped the 3DS, which opportunistically updates its firmware and does an integrity check to make sure that you haven’t altered the old firmware in any way. And if it detects signs of tampering, it bricks itself.
Human rights activists have raised alarms over UEFI, the new PC boot firmware, which can restrict your computer so that it runs only signed operating systems, noting that repressive governments will likely withhold signatures from OSes unless they allow covert surveillance operations.
And on the network side, attempts to make a network that can’t be used for copyright infringement always converge on the surveillance and control measures that we know from repressive governments. So SOPA, the American Stop Online Piracy Act, bans tools like DNSSEC because they can be used to defeat DNS-blocking measures. And it bans tools like Tor, because they can be used to circumvent IP-blocking measures. In fact, the proponents of SOPA, the Motion Picture Association of America, circulated a memo citing research that SOPA would probably work, because it uses the same measures as are used in Syria, China, and Uzbekistan. And they argued that these measures are effective in those countries, and so they would work in America, too. [applause] Don’t applaud me, applaud the MPAA.
Now, it may seem like SOPA is the end game in a long fight over copyright and the Internet, and it may seem like if we defeat SOPA, we’ll be well on our way to securing the freedom of PCs and networks. But as I said at the beginning of this talk, this isn’t about copyright. Because the copyright wars are just the 0.9 beta version of the long coming war on computation. The entertainment industry were just the first belligerents in this coming century-long conflict. We tend to think of them as particularly successful. After all, here is SOPA, trembling on the verge of passage, breaking the Internet on this fundamental level in the name of preserving Top 40 music, reality TV shows, and Ashton Kutcher movies.
But the reality is that copyright legislation gets as far as it does precisely because it’s not taken seriously. Which is why on one hand Canada has had Parliament after Parliament introduce one stupid copyright law after another. But on the other hand, Parliament after Parliament has failed to actually vote on the bill. It’s why we got SOPA, a bill composed of pure stupid, pieced together molecule by molecule into a kind of Stupidite 250 that is normally only found in the heart of a newborn star.
And it’s why the rushed‐through SOPA hearings had to be adjourned midway through the Christmas break so that lawmakers could get into a real vicious, nationally‐infamous debate over an important issue, unemployment insurance. It’s why the World Intellectual Property Organization is gulled time and again into enacting crazed, pig‐ignorant copyright proposals. Because when the nations of the world send their UN missions to Geneva, they send water experts, not copyright experts. They send health experts, not copyright experts. They send agriculture experts, not copyright experts. Because copyright is just not important to pretty much everyone. [applause]
Canada’s Parliament didn’t vote on its copyright bills because of all the things that Canada needs to do, fixing copyright ranks well below resolving health emergencies on First Nations reservations, exploiting the oil patch in Alberta, interceding in sectarian resentments among French and English‐speakers, solving resource crises in the nation’s fisheries, and a thousand other issues. The triviality of copyright tells you that when other sectors of the economy start to evince concerns about the Internet and the PC, that copyright will be revealed for a minor skirmish and not a war.
Why would other sectors nurse grudges against computers? Well, because the world we live in today is made of computers. We don’t have cars anymore, we have computers we ride in. We don’t have airplanes anymore, we have flying Solaris boxes with a big bucketful of SCADA controllers. A 3D printer is not a device. It’s a peripheral, and it only works connected to a computer. A radio is no longer a crystal. It’s a general‐purpose computer with a fast ADC and a fast DAC and some software. The grievances that arose from unauthorized copying are trivial when compared to the calls for action that our new computer‐embroidered reality will create.
Think of radio for a minute. The entire basis for radio regulation up until today was based on the idea that the properties of a radio are fixed at the time of manufacture and can’t be easily altered. You can’t just flip a switch on your baby monitor and turn it into something that interferes with air traffic control signals. But powerful software-defined radios can change from baby monitor to emergency services dispatcher to air traffic controller, just by loading and executing different software. Which is why the first time the American telecoms regulator, the FCC, considered what would happen when we put SDRs in the field, it asked for comment on whether it should mandate that all software-defined radios be embedded in trusted computing machines. Ultimately, whether every PC should be locked so that the programs it runs are strictly regulated by central authorities.
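The regulator's worry can itself be sketched in a few lines. In an SDR the waveform is just arithmetic; the band assignments below are hypothetical, and the point is only that nothing in the hardware path distinguishes one service from another:

```python
import math

SAMPLE_RATE = 500_000_000  # samples per second; illustrative only

def waveform(freq_hz: float, n: int = 8):
    # Same DAC, same antenna path; only this software parameter differs.
    return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

BABY_MONITOR = 49_860_000    # hypothetical band assignments
AIR_TRAFFIC = 121_500_000

waveform(BABY_MONITOR)
waveform(AIR_TRAFFIC)  # "loading different software" is the whole change
```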
And even this is a shadow of what is to come. After all, this was the year in which we saw the debut of open‐sourced shape files for converting AR‐15s to full automatic. This was the year of crowdfunded open‐sourced hardware for gene sequencing. And while 3D printing will give rise to plenty of trivial complaints, there will be judges in the American South and mullahs in Iran who will lose their minds over people in their jurisdiction printing out sex toys. The trajectory of 3D printing will most certainly raise real grievances, from solid‐state meth labs, to ceramic knives.
And it doesn’t take a science fiction writer to understand why regulators might be nervous about the user‐modifiable firmware on self‐driving cars, or limiting interoperability for aviation controllers, or the kind of thing you could do with bio‐scale assemblers and sequencers. Imagine what will happen the day that Monsanto determines that it’s really, really important to make sure that computers can’t execute programs that cause specialized peripherals to output organisms that eat their lunch, literally.
Regardless of whether you think these are real problems or merely hysterical fears, they are nevertheless the province of lobbies and interest groups that are far more influential than Hollywood and Big Content are on their best day, and every one of them will arrive at the same place. “Can’t you just make us a general‐purpose computer that runs all the programs except for the ones that scare and anger us? Can’t you just make us an Internet that transmits any message over any protocol between any two points, unless it upsets us?” And personally, I can see that there will be programs that run on general‐purpose computers and peripherals that will even freak me out. So I can believe that people who advocate for limiting general‐purpose computers will find receptive audiences for their positions.
But just as we saw with the copyright wars, banning certain instructions or protocols or messages will be wholly ineffective as a means of prevention and remedy. And as we saw in the copyright wars, all attempts at controlling PCs will converge on rootkits. All attempts at controlling the Internet will converge on surveillance and censorship. Which is why all this stuff matters. Because we’ve spent the last ten-plus years as a body sending our best players out to fight what we thought was the final boss at the end of the game, but it turns out it’s just been the mini-boss at the end of the level.
And the stakes are only going to get higher. As a member of the Walkman generation, I have made peace with the fact that I will require a hearing aid long before I die. And of course, it won’t be a hearing aid. It will be a computer I put in my body. So when I get into a car (a computer I put my body into), with my hearing aid (a computer I put inside my body), I want to know that these technologies are not designed to keep secrets from me and to prevent me from terminating processes on them that work against my interests. [applause] Thank you.
So last year, the Lower Merion School District, in a middle‐class, affluent suburb of Philadelphia found itself in a great deal of trouble, because it was caught distributing PCs to its students equipped with rootkits that allowed for remote covert surveillance through the computer’s camera and network connection. It transpired that they had been photographing students thousands of times, at home and at school, awake and asleep, dressed and naked. Meanwhile, the latest generation of lawful intercept technology can covertly operate cameras, mics, and GPSes on PCs, tablets, and mobile devices. Freedom in the future will require us to have the capacity to monitor our devices and set meaningful policy on them, to examine and terminate the processes that run on them, to maintain them as honest servants to our will, and not as traitors and spies working for criminals, thugs, and control freaks.
And we haven’t lost yet, but we have to win the copyright wars to keep the Internet and the PC free and open. Because these are the matériel in the wars that are to come. We won’t be able to fight on without them. And I know this sounds like a counsel of despair. But as I said, these are early days. We have been fighting the mini-boss, and that means great challenges are yet to come. But like all good level designers, fate has sent us a soft target to train ourselves on. We have a chance, a real chance, and if we support open and free systems and the organizations that fight for them—EFF, Bits of Freedom, EDRi, [Org?], CCC, Netzpolitik, La Quadrature du Net, and all the others who are thankfully too numerous to name here—we may yet win the battle, and secure the ammunition we’ll need for the war. Thank you.
Cory Doctorow: So, either questions or long rambling statements followed by, "What do you think of that?" Any questions?
Audience 1: So if you game this out all the way to the end, you end up with a situation where either the censorship people have to outlaw von Neumann and Harvard architectures and replace them with something that's not a universal Turing machine… Or they lose, full stop. And there's a big spectrum in between the two. Don't let me distract from that. I'm not talking about the very very last bastion line of freedom there. Do you think a bunch of assholes that don't even understand how DNS works are going to be willing to shoot themselves in the foo—are going to shoot themselves in the head that hard?
Doctorow: So, I guess my answer is that the fact that there's no such thing as witchcraft didn't stop them from burning a lot of witches, right?
Audience 1: Right, right.
Doctorow: So by the same token, I think the ineffectiveness of the remedy is actually even worse for us, right. Because this is like the five-year plan that produces no wheat, that yields an even more drastic five-year plan that also produces no corn. I mean, this will make them angrier and cause them to expand the scope of the regulation. You know, "the beatings will continue until morale improves" as the t-shirt goes, right. That's actually my worry. I think if they saw some success they might actually back off. The fact that this will be a dismal failure over and over and over again, the fact that terrorists will continue to communicate terrorist messages, and child pornographers will continue to communicate child pornographic messages and so on, will just make them try harder at ineffective remedies.
Audience 1: Yeah. I mean, a specialized Turing machine on an ASIC is actually really really hard. Because you have to make one for every application, and that sucks.
Doctorow: Yeah. So again, I don't think they are going to ban general purpose computers. I think what they're going to do is they're going to say, "We want more spyware in computers. We want more UEFI." And not just UEFI that helps you detect spyware but UEFI where the signing is controlled by third parties, and you don't have an easy owner override, and all the rest of it. I think that that's going to be the trajectory of this stuff, not, "Gosh that stupid policy that we pursued at great expense for ten years was a complete failure. We should admit it and move on." I think that the answer is going to be, "Oh my god. Look at what idiots we looked like. We can't possibly admit defeat." You know, see: the War on Drugs.
Moderator: We've actually got quite a bit of time here so long, rambling statements are cool. So, next question.
Audience 2: Regarding the recent initiative by a big software company to promote secure boot on UEFI, do you think that personal computers will arrive at the situation of the PlayStation or similar platforms soon? And do you think that we'll have some means to counterattack, or to…
Doctorow: Yeah, so the question is really: is UEFI going to be a means of freezing out alternative operating systems on the desktop? And I kind of feel like technocratic, well-educated, Western, Northern, middle-class people are always going to be able to figure out how to get around this stuff, not least because I think organizations like the FTC will probably object pretty strenuously unless there's, you know, a way to take the lid off and press a little red button to reset UEFI, which is what they're talking about now. What I'm more concerned about is that repressive governments are going to say, "Any boards that are imported into our country," which will be most of them, not all of them, but most of them, "will have to only run OSes signed by our certificate authority, and our certificate authority will say unless you've got spyware you can't import the machine."
Audience 2: But do you think it will be illegal to reverse engineer, for example to defeat the secure boot—
Doctorow: I mean, that's an interesting question. I kind of put that to some people who were involved in the Software Freedom Law Center, where they've been working on this. And my feeling is that that would be the right kind of stupid law. Because it would be one that I think is pretty coherent with existing free speech and code questions like Bernstein and so on. And it would probably actually knock back anti-circumvention provisions in most places where you tried it. I think that that would be the kind of case we could maybe win, as opposed to some of the harder ones. So yeah, I'm not super worried about that; I'm moderately worried. I think if you are worried, the right place to go is the Software Freedom Law Center. This is their big issue for the year. So, they need your donations; it's the end of the year.
Audience 3: Hi. Don't you think that after the dust settles from all this idiocy over the Internet, in the end it will be something like the law of the sea? That you have exclusive economic zones where some states and policymakers will run wild with spyware and everything. And then you have a more or less lawless common zone, where there's absolute freedom, because not only individuals will need it but also companies, in the end, and everyone else?
Doctorow: Well, I guess companies don't need lots of freedom if all you want is for incumbents to stay where they are, right. I mean, that was the thing that freaked everyone out about Singapore: there was this doctrine that said free markets brought freedom. And one always led to the other, and one required the other. And Singapore showed you that you could actually have vibrant, thriving markets that also didn't have any kind of freedom, and where you could have serious regulation. China, too, I think.
So I don't know that the kind of historical forces you're describing, you know, companies will demand freedom therefore we will have lawless zones, are necessarily true. I think if there's one thing companies generally want, it's monopoly not freedom. It's lots of regulation that gores everyone else's ox and supports their own business model. That's always been the case.
Audience 3: And as an organizational idea, sort of from an international law perspective, let's say: a treaty of the digital high seas? Something like that?
Doctorow: I guess that the difference is that the Internet's not the ocean, right. Things that happen in the middle of the Pacific Ocean, their light cone, to borrow some physics jargon here, the light cone may never in fact reach shore. And if it does, it could take a very long time, especially when the "law of the sea" was being worked out, when it took weeks and months.
The Internet's more like a web, right, where everything that trembles on the web makes the whole web shake. And so the fact that something is happening in the "lawless zone" is unlikely to have no intersection with what's happening in the lawful zones. I don't know that that's a great answer. I think maybe laws that respect freedom everywhere that we can get them and using the Internet to try to expand that sphere might be a better way to go.
Audience 4: The problem I see is that we're looking at a system that is getting more and more complex, but more and more broken along the way, too. And not just the Internet, but everything. The production of goods by big companies: we have two producers of CPUs, and maybe seven who produce memory. And so if Intel or AMD decide okay, we'll just ship every product we make with UEFI, with signed firmware, we're fucked.
So the problem is that we as humanity are not able to produce goods that we understand and can use as we want. So I think we should really put effort into distributed networks and building our own hardware, because we have the Internet, but nowadays 99.9% of the population think the Web is the Internet, and they think Facebook is the best thing invented in human existence. So we have to really break things down much more again and try to put out open source solutions for every problem. I know this is a lot of effort and not really convenient, because you don't have the great integration that Apple will bring you, but yeah, maybe just as a thought.
Doctorow: Well, I think that the thing about Facebook is that it works incredibly well, it just fails very badly. So all the things that it's good at, it's really good at. But when it fails, it destroys your personal life or it allows all of your friends to be rounded up by the Syrian secret police and tortured and murdered or whatever, right. I mean, there's lots of ways in which Facebook is unfit for purpose.
But we have to understand why people use it. They use it because it works well. And if we want to convince people that proprietary or difficult technologies are likely to bite them in the ass in the future, we have to convey to them its failure modes. And that's the tricky thing. And of course this is not a problem new to computers, although maybe the stakes are a little higher. This is the problem with smoking, right? If you got cancer as soon as you put the cigarette in your mouth, no one would put a second cigarette in their mouth.
I smoked for half my life. When I went to quit, my doctor said, "You need a better reason than not getting cancer in thirty years, because next week when you crave a cigarette, not getting cancer in thirty years won't keep you warm at night." And what I actually did was I realized that I was spending two laptops a year on cigarettes. And so I just said I'll buy myself a laptop every year from now on if I give up smoking, and I did. And that kind of helped.
But we need to help people understand— The problem that I find is that we tend to attack people on the upside. We say, "Oh, Apple's integration isn't as good as you think it is." Or, "Facebook isn't as entertaining as you feel like it is." And in fact, both of those, they are. I think what we need to convey is all the ways in which it fails that are not immediately obvious at the outset. And that's a hard problem.
Moderator: Here we have another question from the mighty Internets. Can you please say something about the difference between Europe and the USA, and if there is something of a feedback loop in driving each other in the wrong direction?
Doctorow: So the question is the feedback loop between America and Europe, or the USA and Europe, and what direction they go. And I mean, obviously there is this transatlantic table tennis in terms of copyright that we've had before where you get term extension, an extension of the length of copyright in America and then Europe has to "harmonize" with America by extending its copyright even longer than America's. And then America has to "harmonize" with Europe to make its copyright longer than Europe's, and so on.
But increasingly, the way that that kind of stuff happens is in these really secretive and sinister treaty negotiations like ACTA and the Trans-Pacific Partnership, where you have treaties negotiated by European and American representatives but without any representatives from the governments. They're done by their administrations and without any transparency into it, and then it's presented as a fait accompli. And in that way I think that there's not much difference, because they're both getting shitty laws in the same way now. They're coming out of the same source.
And they're not really American companies. I mean, EMI is technically kind of a "British" company. And Bertelsmann is technically a German company, and Holtzbrinck is, too. And so on and so forth. Sony is technically Japanese, right.
Audience 5: So, you've had quite a few questions. I'm your rambling guy for tonight. I'll keep it short. I really like the analogy of the five-year plan, and the next five-year plan, and the next five-year plan. Because I come from Poland, and we know how the five-year plans in Poland went. So. I have this feeling that those five-year plans will probably end up just like they ended up in Poland. What do you think about that?
Doctorow: Yeah. And that's probably an extension of what this gentleman over here said when I finished a little out of turn, which was: if this has the trajectory of the drug war, where the hell are we going to end up? Because obviously the drug war has been a disaster, continues to be a disaster, and shows no sign of receding from its disastrousness.
And the difference is that, although I guess there are people who would argue that drugs deliver a certain bit of the solution to their own problem— In other words, once you've taken the right drugs, you no longer see the problem with drugs. And I don't mean that sarcastically. All due respect to my friends on the drug legalization side, and the idea of cognitive liberty and the rest of it.
But it's not the same way that computers hold the key to unlocking computers. Because what computers do, what networks do, is make it cheaper and easier for us to do things together. So you know, in contrast to, and again with respect to my friend Evgeny Morozov, in contrast to things that he says, I'm much more optimistic about what computers can do for justice struggles. Because by definition, people in charge have already figured out how to coordinate their actions, right. That's how they got to be in charge. So giving them technology to make them better coordinated is a small incremental improvement, whereas people who are oppressed by definition have no capacity to steer the state or work collectively. Adding the capacity to work collectively to people who are at the bottom is a phase change for them in a way that it's not— It's a difference in kind and not just a difference in degree.
And so I think computers and networks allow us to do stuff together that we could never have done before. And the more computers and networks we get, the more things we can do together with them. You know, there was this kind of tedious thing that happened about six or eight months ago. Whenever I mentioned the word "anonymous" in public, I would say, "Anonymous is a group that—"
And someone would come along and go, "They're not a group."
And I'd say, "Anonymous last week did—"
And they'd say, "Anonymous never does anything."
And I'd say, "People using the name 'Anonymous' did—"
"Well, they didn't all call themselves 'Anonymous.' Some of them called themselves LulzSec. Some of them called themselves AntiSec…"
And like, on and on and on. And for a while I thought it was just a sort of tedious word game, you know. It's like free/libre/open source. This kind of endless correctspeak.
But then I realized that it was actually because Anonymous, and many other new kinds of institutions that we've seen in the last year, are novel. That we don't actually have a vocabulary. There's something new on this earth, this kind of affinity organization that doesn't have the same hierarchical structures, even if there are pockets of leadership, the way that there are with AnonOps or with bits of Occupy being spokespeople or coordinators. It's not anything like what these institutions would have been ten or twenty years ago. You couldn't have had anything on the scale of Occupy. You know, simultaneous coordinated actions in cities all over the world. You couldn't possibly have had that without a big sort of military-style command and control organization prior to the network, prior to the Internet. And so we lack a vocabulary to describe them. We lack a vocabulary even to think about them, in some ways.
So we say, "Oh, Occupy doesn't have a set of unitary goals. They must not be serious." What's interesting is that prior to this, assembling a big organization without first agreeing on your goals was cosmically insane, because you'd put all this energy into organizing and then it would turn out that you weren't all there for the same reason; you'd have to all go home again.
And now what we can do is we can all get together and figure out the stuff that we agree on. You know, our minimum common agreement, our TCP/IP of protest. And then we can work on that stuff. And then when we come to some stuff that we don't agree on, we can all go off and have a different Occupy over there to do that stuff, because the organizing itself has been cheap. It's no longer the case that the job of an activist is 98% stuffing envelopes and 2% figuring out what to put in them. Now we get the envelope stuffing for free, and we get to spend 100% of our time figuring out how to do stuff together. And so I do think that there is hope, because the terrain is not the same as the terrain in the War on Drugs.
Moderator: Okay, we've got like ten minutes here. So I'm going to ask Cory. Cory we've got like ten minutes. I think that's like five questions.
Doctorow: So why don't we take like three questions, then I'll answer as many as I can.
Audience 6: Okay, so I think it might become harder to influence the minds in government in the right direction, because we are sending mixed signals. On the one hand we're talking about how they should chill and lay off the DRM and, you know, the copyright war. While on the other hand we are creating technology like Bitcoin or similar stuff. We're not just beating at the money that the lobbyists get; we are trading things that are competing with the Reichsbank and the Fed. And now you're getting the government really pissed. I mean, I'm a big proponent of that— I mean, not the Fed, but Bitcoin or stuff like that. So yeah, what do you think about that?
Doctorow: Right. Two more questions.
Audience 7: Okay. My question is about general-purpose computers versus appliances. You said the threat comes mostly from the lawmakers. One guy before went in the direction I'm going. The problem I'm seeing is that we already have these appliances. Even in this audience, people own iPhones and iPads. We have Kindles. At home we have PlayStations and Xboxes. The list goes on and on and on. So even we want this stuff, because it has good features. You tried to answer in this direction; I was curious if you have anything more to say. I think this is more of the problem.
Doctorow: Okay, one last question and then I'll try and answer them all in one go.
Audience 8: Okay, do you think a movement will develop to change the moral perception of computing, like DARE, to make us feel bad about computing? And do you think hypocrisy will develop among powerful people, who say, "Well, when I was young I programmed a computer, but I didn't inhale"?
Doctorow: Alright. So the first two questions, the question about Bitcoin, and the question about why we continue to use these devices even though they contain our own chains, in some ways are the same question, right. Because the reason that we use these devices is because they work, and we're pretty sanguine about their failure modes because we're technocrats. We're like, "Oh, well. If they lock down my iPad I know how to jailbreak it. If I ever want to run some code on it that someone else won't let me, I know how to do it. I've got a runtime on my computer. I can take my apps off of here and run them there," and so on. So we're very sanguine about it. And so we tend not to worry that the consequences are going to come up and bite us in the ass.
But that's not all that different from lots of other ways in which people tend to overestimate their immortality. I mean, I felt that way. You know, I bought DRM media once. I bought thousands of dollars' worth of Audible books over a period of several years, and it wasn't until I switched to an operating system that Audible didn't have a player for that I realized I was going to have to spend a month running two PowerBooks all the time through an audio capture app to get all those audiobooks out of the proprietary wrapper and into something I could play back. And then I went, "Oh my god, I'm an idiot."
But I was also a smoker, right? And I also sometimes forget to put on my seatbelt. I mean there's lots of ways in which we do this, and if we have to be pure in order to fight, then we'll probably never succeed. I mean, we have to admit that we live in the world and we sometimes either make mistakes or misjudge consequences or just do the thing that's most convenient, and that's how life goes.
The question about whether or not it's wise to piss them off while we're fighting them… You know, I don't think they could be any more afraid of the Internet. I don't think that the thing that they most fear is Bitcoin. I think the thing that they most fear is all the disruption that arises from the Internet.
And Bitcoin's not the major disruptive application of the Internet of the last several years. It's things like Amazon, and it's things like automated high-speed trading and so on. All these things that are kind of legitimate Fortune 100 halls-of-power stuff, that's what has been most disruptive and has got people running around like headless chickens. Bitcoin is just a thing over on the side. The number of people in the halls of power who understand it is minuscule, and the number who take it seriously is a fraction of them. And that's even true of cryptographers, not just people in the halls of power.
And then the last question was: will there be a DARE-style stop-using-computers movement in the future? I think we've already got it now, don't we? I write young adult novels sometimes, as well as novels for adults. And I did the young adult breakfast at Book Expo America, which is a big book trade show in America where all the booksellers come. They have all the young adult booksellers show up, and they have a celebrity come and introduce the three young adult writers who are there.
And our celebrity that year was the Duchess of York, Sarah Ferguson. And I didn't know this, but Sarah Ferguson writes novels about how the whole world sucks and you need to shelter your children from it. You know, strangers are bad, video games are bad, and so on and so forth. So she was introduced around to us, and she said, "What's your book about, Mr. Doctorow?" And I said well, it's a book about kids who play video games and use them to win their freedom.
And she said, "Oh. You like video games, then."
And I was like, "Yeah, my wife used to play Quake for England. I love video games!" I mean, there's already lots of people who think computers are terrible and bad for us and will destroy our lives. I don't think we have to wait for the future for that. I think it's here.
Anyway, thank you all very much.