Aengus Anderson: Welcome back to The Conversation. I’m Aengus Anderson.
Neil Prendergast: And I’m Neil Prendergast.
Anderson: And Micah Saul has recused himself from this discussion because we are going to be talking about online activism, individual liberties, and lots of stuff that he actually knows far more about than either of us but he also works in that field for Big Brother. So, Micah didn’t really want to get involved. He thought we would really be the good ones to have this conversation because we can say whatever we want.
Prendergast: The advantages of being Little Brother.
Anderson: The advantages of being Little Brother. The advantages of being underemployed.
So, that may be enough of a teaser. Today we’re talking to Rainey Reitman. She’s the Activism Director at the Electronic Frontier Foundation. Which means that she spends her time trying to make a host of digital and civil liberties issues things that you understand and care about. We’re talking about things like Internet regulation, government surveillance, and privatized space. In terms of what Rainey is tangibly doing, she and her colleagues at the Electronic Frontier Foundation are regularly filing lawsuits. They’re organizing online activism. They’re developing software tools. And they’re raising public awareness of digital liberties issues. Rainey’s also the cofounder of the Freedom of the Press Foundation and sits on the boards of numerous other groups dedicated to maintaining open information.
Prendergast: Right. And so I would put her into this camp of the doers. So often in our discussions, after the interviews, we say okay, but what would this look like if this great idea were deployed on the ground? And I think you’re going to hear a lot about that in this interview.
Rainey Reitman: The Electronic Frontier Foundation and my work is focused on civil liberties. And we’re actually dealing with how constitutional rights—so these ideas that we have you know, the First Amendment, the Fourth Amendment, about what rights we have to speak and read and have privacy, how those are translating into digital spaces. As we’ve moved into increasingly digital spaces, so online worlds, we’re moving away from your traditional physical spaces where you have public streets; where you have public squares; where people can go to protest, and into areas, if you would call them that, that are entirely controlled by corporations. Whether it’s your ISP that is connecting you to the Internet, or it’s your third-party intermediaries like Facebook, which is making policies around what kind of speech they’ll tolerate, what kind of privacy options they’ll have. Whether it’s somebody like Google deciding whether or not certain things will be allowed in search results. And so instead of the world that we knew previously, and we still have, which had a lot of these traditional public spaces, we’re seeing how our world is translating into a place where your rights are getting dictated by terms of service that nobody’s reading.
Anderson: And I mean, this is going to sound so naïve but like, this is something I hadn’t thought about before: most of our law is based on physical space. The Internet we do think of with spatial metaphors…
Reitman: We do.
Anderson: But it exists and it was the creation of an era that is highly corporatized, whereas probably the laws that govern public space were created in the 18th century.
Reitman: Right, exactly. You know, I think about this in particular with free speech issues. You know, we have as a society really upheld First Amendment values about tolerating speech even if we completely disagree with it. Even if we think it’s abhorrent speech and we really wish nobody held those views. Still we’re going to allow that speech to take place because we as a society have decided, have founded our country on this idea, that we want to have free expression for everyone. We want people to be able to speak their minds, and that we as a society are benefiting from that level of tolerance.
But when you move into an online world, you have places, if you would call them that—web sites—which might for example put up a policy that says, “Well, we’re not going to allow certain types of hate speech. We’re not going to allow certain types of conduct which would offend many of our users.” So for example we see Facebook taking down the groups for women who engage in breastfeeding. Or they might get pressure from other groups to take down other types of content, whether it’s that they don’t want to allow Nazi sympathizers to be on Facebook. Or whether they’ll get pressure from one political group or another to remove certain types of speech. And it can become a PR nightmare for Facebook to attempt to stand up for those less-popular forms of speech, and so it’s easier for them to simply take it down.
That’s not something specific about Facebook. I don’t want to pick on them, they just happen to be one of the biggest online social networks—they are the biggest online social network out there. It’s something we’re going to see in every single web site that we bump into, where they are setting up terms of service that decide what kind of speech is okay. What does that do to our society if all of a sudden instead of having this broad tolerance for free expression, for speech that we disagree with, we have companies who are incentivized to take down speech that’s unpopular? And I think that is a very dangerous situation for us to get into.
Anderson: And it seems really unprecedented in a lot of ways, in that we have this parallel world that we all exist in now—at least if you’re online, you have to exist in privately-owned spaces. There is no public infrastructure.
Reitman: Right, yeah. No, we don’t have anything like a public square or public streets in the world of the Internet, if you would call it that. Another example that I’ve been really thinking about a lot for the last couple years is financial transactions in the Internet world. And I think a very good example of this would be around WikiLeaks and how right after WikiLeaks started publishing the State Department cables we saw them get shut down by PayPal, by Visa, by MasterCard. And I think it was a wake-up call for a lot of people that in fact it’s very hard to exist on the Internet as an entity that is donation-driven if all the major payment providers decide without a court order, without an entity even getting accused of a crime, that they’re not going to allow payments to be processed for them.
Anderson: That’s fascinating, right, because the physical-space analog would be you walk down to the office, or you send in your check through the public mail…
Reitman: Right. Or you give them cash.
Anderson: Right.
Reitman: I mean, at the end of the day, I think one of the—
Anderson: There’s a physical thing.
Reitman: Well, we’re working towards a world where more and more transactions are online. Online transactions don’t allow you to pay with cash, they just don’t. I mean, we could start talking about Bitcoin and that’s a fascinating issue, but right now we are seeing that payment intermediaries like PayPal and Visa and MasterCard, they’re having huge influence over what kind of speech can survive on the Internet. And that means that these companies are setting up policies and then arbitrarily enforcing them where they see fit. And the effect is that certain types of speech will wither and die on the Internet because it can’t get money in the door.
Anderson: What are the ramifications if this keeps playing out? It’s easy to brainstorm a bunch of stuff related to free speech. What else happens?
Reitman: That’s interesting. Let me think what else happens. It’s so easy for me to think of what happens to free speech. I think another implication…and I don’t know if this will happen. It’s something I’m certainly concerned about as more and more of our transactions in life move onto the Internet: the constantly-identifiable person online.
You know, if we look at the early days of the Internet, it used to be that you could go and you could join various groups and you could have an identity, you could try on an identity. You could go to a different group, you could try on a different identity. Increasingly we’re seeing persistent identifiers sort of lingering as you move from web site to web site. For example, any news article that you might want to comment on, you’re probably signing in with your Twitter account, with your Facebook account. Those things especially on Facebook, for example, are directly linked to your real name. What you do and say online becomes directly associated with who you are in the physical world. And I think that’s an interesting issue. As much as we try to defend free speech, I think we must also defend and tolerate anonymous speech. There are points of view and issues which for a whole slew of reasons we might not want to have tied to our real identities.
Anderson: Though with anonymous speech you always get that sort of statement that when it’s tied to who you actually are, people behave better, right. Isn’t that kinda the counterargument to that?
Reitman: Right. So is that what we want? I mean, at the end of the day, is what we ultimately want a society where people are just on their best behavior all the time? And I don’t know if that’s exactly what we’re trying to achieve here. I mean, I think that’s an interesting counterargument. I’m not sure it’s true. But to the extent it is true, ultimately I’m not sure that we want to be creating policies about what kind of speech is tolerated simply because it’s going to be the least offensive and the least in-your-face and the least radical.
I often think back towards the early gay rights movement, or for example the gay rights movement right now in Russia. It was, and there it currently is, very difficult to speak out and say you know, “I think that we should have equal rights.” And I wonder, if for example in the future it were very hard to speak out online without having your speech tied to your real-name identity, would people be as bold, as willing to fight for something they believe in, as willing to experiment with something while they were trying to figure out what their views were, if they knew it was going to be tied to them forever? That’s the other thing about the Internet: it doesn’t forget things.
Anderson: It doesn’t forget, right. And I was thinking of the quote, and I can’t remember who said it right now but, “Consistency is the hobgoblin of small minds.” And the notion that if you are expected to be consistent throughout your life, in a way that sort of demands that you not grow.
Reitman: Right.
Anderson: And because I think consistency is often seen as a virtue—
Reitman: The world changes.
Anderson: The world changes and the mind is inconsistent and we form bad ideas often and have to revise them… So it seems like that—
Reitman: And society has changed. I mean, I look at you know, America when my grandmother was a child and America now, and it’s wildly different. And I think that that’s a good thing, to a large extent. I want to create a world online where individuals have the ability to change over time and not constantly be stuck to the dark shadow of their early Internet experiences.
Anderson: I think for a lot of people they’d go, “Well you know, corporations created these infrastructures. We voluntarily use them. We can choose other corporate ones. It’s never going to get me, there’s still this real world out here. I don’t need that.” Of course that seems like a fallacy but I’d like to address that a little more.
Reitman: Well, I mean, we’re kind of talking about the future, right? It used to be that talking about digital civil liberties issues was a little bit more fringe. This is a mainstream issue now, right. Technology has bled into everything we do. Most of us are carrying a little cell phone everywhere we go. A little cell phone that is how we connect with our friends, that’s tracking our location, that’s connecting us constantly to the world, that allows us to take pictures which have metadata in them which we post to Twitter.
And then our financial transactions are increasingly moving away from these cash-based, much harder to track systems to these digital, credit card-based, extremely trackable systems. We’re moving into a world of smart houses that can actually see what time you turn your television on every day. What time you’re turning it off every day. Is there a particular day when you left it on a little bit longer? When are you using your microwave? And it’s all a computerized system that’s actually integrated. And it’s all for our benefit, the argument goes, like the technologies we use every single day. And cars are getting this way, too.
Anderson: So, we get to the word that— Or the two words that you had been just thrilled that I used at some point, “big data,” right.
Reitman: Yes. Big data.
Anderson: We are creating these enormous trails everywhere.
Reitman: Right, that’s completely correct.
Anderson: And they’re moored to what we do in the physical world.
Reitman: Right. Our physical—
Anderson: But they’re controlled in the private world?
Reitman: Well, so here’s what’s up. I don’t know if I think of the physical world as the physical world anymore. The physical world and the digital world are becoming increasingly intertwined, and it’s going to happen in extraordinary ways that you and I probably can’t even predict, but we can try right now. Which is going to be that everything in the physical world is going to have this digital trail attached to it.
You know, one of the things they’ve talked about is sort of responsive advertisements on the street. You know, would we be willing to walk by a billboard that could see, back at us, how old we are, what our gender was, and show us an appropriate ad? Would we want that kind of a billboard, for example, to do facial recognition on us and match it to Facebook profiles of people in this area? And what would that be like?
I mean, again, I’m doing sort of large-scale, long-term speculation about what it could be like to be in the physical world years from now. But it’s not that the technology isn’t there. Some of that technology already exists. The question is, you know, how’s it going to get implemented? And the truth is everything we do is going to be creating this long data trail about us, and that’s a data trail that’s interesting for companies that are interested in doing advertising. It’s also interesting to the government that wants to…I don’t know, track down people evading their taxes, or go after terrorism or what have you. And increasingly I think that individuals want access to that kind of data.
Anderson: And that’s where it gets really interesting. So it feels like we have this old physical realm, which has a lot of public space built into it.
Reitman: Mm hm.
Anderson: We create a new digital realm which is entirely divided up amongst private holders.
Reitman: Mm hm.
Anderson: The digital realm moves into the physical realm.
Reitman: Right.
Anderson: So it’s basically a blanket over the old public physical realm.
Reitman: Right.
Anderson: You’re in a public street, but your car is sending back private data through private airwaves.
Reitman: That’s right. And then the other thing about that is data storage is cheap. It’s not expensive. It’s gonna get cheaper and cheaper and cheaper. Tracking mechanisms are going to get more sensitive, more sophisticated. And we’re going to get to this place where…and we are…kinda pushing into it right now, where the default is to remember everything.
Anderson: I mean, it feels like we’re so young in the way we’re going into this. And our expectations are so childish. We go, “I want to remember everything!”—
Reitman: Right.
Anderson: —without the wisdom to say, “Well, the brain forgets in a lot of ways so it can filter data.”
Reitman: Right.
Anderson: Or so it can forgive. Total memory can be such a curse. Are we learning that, or do we need to learn that in the digital realm?
Reitman: Well certainly there are historical things that’ve happened that it would benefit our society to remember. But the ability to change and adapt, the ability to try new things, the ability to forgive, and forgive ourselves… Those are things that I think a little bit of forgetfulness can be helpful with.
The other thing is that…and this is the big elephant in the room: what about privacy? Don’t people have a need to be able to make certain things private? More and more social networks and online spaces are making things public by default, and I think as a society we’re starting to tolerate the public-by-default concept. Is this something that we the consumer (if you would call it that) want? Or is this something that those corporate entities, which are in fact the ones that are creating and administering these online services, get a lot of benefits out of and so they’re pushing us there as much as they can? And so public, recorded, never deleted.
Anderson: Do you think there’s a bias in technology itself that leads towards the creation of data and the centralization of control? By always…
Reitman: Yes. [laughs]
Anderson: Oh, that’s fascinating. I was not expecting you to say that.
Reitman: So you were saying a bias in technology itself that leads to the centralization of control and the collection of data?
Anderson: Right, because it seems like a lot of what technology does is it gives us the ability to do more things with fewer people.
Reitman: Right.
Anderson: And so in a way like, it can expand control. And of course—
Reitman: Right.
Anderson: —the counterargument is always like, the Internet is this great democratic force…
Reitman: Kind of. So, I also think technology’s changing, the Internet’s changing. If you look back historically the Internet was something where like, you would order your computer, you’d put it together yourself. Free software advocates are extremely passionate about this, about the ability to…to tinker. For example one of the issues we deal with a lot is how individuals with disabilities, for example, want to be able to hack their phones a little bit so that they work for them.
But, as more and more people are using technology, we’ve started to prioritize making it easy? And in fact, companies have—and Apple is the obvious example of this—have prioritized making it very difficult to hack around, with the idea being that the company knows best. Don’t try to tweak it. Don’t try to make it your own. We’ll ship the best possible product and you just buy that product; it does this thing that we told you it would do.
That shift, from hacking to receiving, is a shift that lends itself to centralized control. So that is an issue we’re already grappling with, quite frankly, in the technology space. And then as far as the collection of data, yes, a lot of the computer technologies we’re dealing with collect data. And I think that there’s a couple of issues to flesh out in there, but one thing I think would be important is enough transparency so that an individual knows what a company has on them.
Anderson: Right. Yeah.
Reitman: And we’re seeing this right now with the National Security Agency, right. We’re seeing increasingly, people are up in arms because…not to switch from private companies to the surveillance issues, but they’re connected in a lot of ways.
Anderson: Oh yeah. And we can switch to them. I mean, that was something that I talked to James Bamford about a lot. It connects to kind of where I wanted to go next, where does all of this lead?
Reitman: Right.
Anderson: And Bamford, who was looking specifically at the government, basically said “police state.” If you want to look at the worst case scenario, and that’s not inevitable, but like, it becomes a lot easier if you’re in power to surveil people, you know. It’s something you can automate now in new ways. And he talked about, with the Stasi you’ve got to have one guy, one set of earphones, one target.
Reitman: Right.
Anderson: And we’re way beyond that.
Reitman: No, he’s completely right, we are way beyond that. We have tools now available to the governments for surveillance that we could never have even dreamed of fifty, a hundred years ago. Most recently leaks seem to indicate that the government is getting truly mind-boggling amounts of data about our Internet usage, and also tracking phone records of millions of phone users, so. I don’t know that I like to use the word “police state.” And the reason I don’t is because I think that people don’t respond well to it? [laughs]
Anderson: Well, I certainly— I mean, it’s easy to write off, right?
Reitman: Right.
Anderson: Because you think, “Oh, Orwell. We’re not— I mean, it’s sunny San Francisco outside. We’re not dealing with that.”
Reitman: Right. And it’s a beautiful day and you know, I’m using my Fitbit to track my exercise, and I’m checking my Google calendar, and it’s sending an automatic update to my phone which gives me location directions to everywhere. So what if the NSA has a backdoor to these sorts of things. I’m sure they’re not interested in me.
That’s what I hear sometimes. So what I’m trying to say is that look, at the end of the day, we founded our country with certain beliefs about when the government could and couldn’t get access to our communications, to our private records. You needed to have probable cause and a judge needed to sign off on it. Well, technology has moved so quickly, and we have been so reactive to some of the tragedies in our history as a country, that we’ve put in place secret courts that are authorizing dragnet surveillance. And I think it’s something that, now that it’s been made public, we really need to have an honest conversation about whether it’s something we’re comfortable with. I’m not comfortable with it. And I think if we don’t stop it right now it’s going to get a lot worse.
Anderson: And it seems like there’s a big undercurrent here. You were talking about how much our society has changed since your grandmother’s time.
Reitman: Mm hm.
Anderson: I’ve talked to a lot of people in this project about civil rights issues.
Reitman: Mm hm.
Anderson: And, many of those things were illegal.
Reitman: Right.
Anderson: If you look at the civil rights movement, it was a lot of African American people being illegal.
Reitman: Right.
Anderson: Under the law they were…wrong.
Reitman: Right.
Anderson: As we talk about this stuff, it seems like a big part of the free speech issue that we were talking about earlier in our conversation ties back to this notion that change that is morally good can be illegal.
Reitman: I think that’s right. And I think this ties back to what you were saying about forgetting earlier. It is…difficult in the moment to grab a protest of thousands of people and arrest them all and hold them accountable and charge them with trespassing, and then take them to court one at a time. I mean, that’s a pain, right. And doing it in real time would be kind of…a bit of a media splash.
Well, it’s easy to take a photograph of a protest. And the technology that we have now can actually hone in on each and every person. And as facial recognition technology improves, it will be easier and easier to identify every person and to keep that information on file, whether it’s now or six months from now or two years from now. And I wonder about the extent to which individuals who attend those kinds of events will face repercussions that they wouldn’t have faced historically. I mean historically, we have held a certain amount of tolerance for civil disobedience, a willingness to allow people to push hard to change the law, including by breaking the law as part of a protest, and then to show them a certain amount of leniency.
Well, as we move into increasingly digital worlds, are we going to have that leniency? And, do we have any possibility of ever forgetting? That doesn’t just affect the people who attend, for example, a protest. It affects all the people who decide not to go to a protest.
Anderson: That’s what I was thinking, it’s got a chilling effect, right?
Reitman: It has a huge chilling effect. It’s the knowledge that any time you go somewhere, to a public place or what have you, you’ve got this record of it. And I think there can be real societal harms to having constantly trackable and traceable technology infiltrating everything we do.
Anderson: A lot of the proponents of tracking, I mean there’s obviously the side that well, it can make your life more convenient. But I think the other part of it you alluded to earlier in regards to terrorism, is that a lot of this stuff makes you safer. It makes people behave better. It’s the panopticon thing, right?
Reitman: Right.
Anderson: When you know you’re being watched, you don’t do anything wrong.
Reitman: Right. Well, I think that there’s different studies as to the efficacy of the panopticon. Before we start selling our privacy off we should know at what cost. At 0.000000002% increase in security for all of our privacy? I mean, that doesn’t seem like a very good deal to me. And so I think there’s efficacy questions…
And then, I remember I was talking to someone about this. There was a study—and I think there are questions about the study—but it showed that there were a lot of people in America who were okay with trading some of their privacy for a little more security. And then the response was, it doesn’t matter, because it’s absolutely unconstitutional. You know, I mean, if I talk about what I want moving forward, I would like the constitutional rights that we founded this country on to come with us into the digital age. Even as we’re really moving incredibly fast into digital worlds and changing things, I don’t think that those basic values should just get left by the wayside. I think we should bring them with us.
Anderson: Why are those constitutional values good? Why not move into like a very comfortable surveillance state?
Reitman: [laughs] Why not move into a comfortable survei— You know, that’s…that’s an interesting question. And one I haven’t been asked that often, I’ll be totally frank with you.
Anderson: Good. [Reitman laughs] You know, someone’s got to go to bat for the dictatorship.
Reitman: Yeah, sure.
Anderson: Let it be me.
Reitman: I think that there’s a couple of things. I mean, we’ve seen historically the incredible price that societies have paid for surveillance states. I mean we saw it with the Stasi. I remember I was actually in Germany and went to the Stasi museum. I saw all of these different surveillance devices they used against individuals and I actually got to see pictures of people protesting and going into the Stasi headquarters and finding their own records. And I got to see their faces. And just the shock and the pain that was just so apparent. That everyday individuals were deeply horrified by what was going on.
We’ve seen that in Egypt, where we saw people take over state security headquarters and go in and find the government had been tracking their Internet communications in ways that horrified them. And I actually got to see pictures from there and I was struck by the similarities. How that hurt and shock—almost a growing numbness at not being able to handle it—was just so clear on the faces of the people.
It’s clear that when people realize it affects them, it hurts them on a gut level. To be like, “This is my life. This is my world. These are the neighbors that I talk to. These are the places I’ve gone…” That’s nobody’s business, period.
Anderson: Right. So we can kinda intellectualize this privacy conversation all day long, but when you feel it and when you see that people have basically pried through your life—
Reitman: Right, I think that a problem we’re facing with defending rights on the Internet is that too often people don’t realize it affects them until it is them. And so what we’ve got to do is find ways to talk to people about these issues so that they understand that it’s not an abstract problem. It’s a problem that affects them on a very personal level because they are Internet users.
Anderson: How much of it also do you think ties into understandings of big systems, right? So we can go about our lives and like, most people are just busy working. And they go to work and they go home and they deal with mortgage payments and lots of other things; kids and life, and it’s busy.
The stuff you guys deal with here is really kind of air war stuff. We’re talking about law, and lobbying, and things that’re happening far away—
Reitman: It’s hard to personalize it, yeah. Well, I think it’s about having to understand big systems, but I would also say that especially when it comes to surveillance issues, the government has spent a lot of time being vague about the details, period. And avoiding making those public. And they have resisted answering our Freedom Of Information Act requests on these things. And they have fought us in court to not be transparent about the degree of domestic surveillance that’s going on.
It has taken revelations from whistleblowers to make a lot of this information public. I mean, as hard as it is for everyday individuals to wrap their heads around big systems, it’s much much harder if the government has gone on a concerted campaign to hide the information from people.
Anderson: I’m thinking of this conversation I had the other day with an economic blogger named Charles Hugh Smith. We ended up talking about whether abundance—economic abundance, or at least relative wealth—makes you not too concerned about anything that might require sacrifice now. If your life is generally pretty good, it doesn’t feel like you’re being spied upon, you’re relatively wealthy, you kind of don’t imagine needing to use your First Amendment rights the same way…a black person would in 1964.
Reitman: Right. That it’s hard to move us out of our comfort zone. I think that’s a really interesting point. I see that in particular with popular technologies. So for example you know, nine of the well-known Internet companies were implicated in NSA spying recently. But, just seeing the name “Google” or “Facebook” or whatever on one of these leaked documents, is that actually going to incentivize people to abandon their Gmail accounts and their Facebook accounts? Or have they gotten so used to it, so comfortable in those spaces, frankly enjoying the communities that they may have there, that it’s hard to show them anything that would say, “Okay, now don’t use Gmail anymore?”
Anderson: I think you made a really interesting point talking about their online communities. In a way, socially…if anyone listening has tried to opt out of Facebook they probably know that it can be really difficult in that there are a lot of invites—
Reitman: Right.
Anderson: You know, all of your friends who are far away, you kind of lose track of them. Like, there are legitimate uses for these things which make it… you pay a social price at this point for opting out because so many have opted in.
Reitman: Oh, absolutely. Absolutely. I think that I wouldn’t want to leave my online communities. I’m sure you wouldn’t, either. Probably people listening to this podcast wouldn’t either.
Anderson: Totally.
Reitman: But then there’s a real question about okay, so if we don’t want to leave our online communities, and those communities are entirely controlled by corporations, and those corporations aren’t receiving money from us most of the time—they’re actually selling our data to advertisers or selling access to it or selling face time with it—what way do we even have to push back and advocate for our rights? That’s a hard question.
Anderson: Especially when there’s maybe a harder question which is, from your standpoint here… I told you about the other people in this project. How do you get them on board realizing that this issue is part of their issue?
Reitman: So that’s an interesting question. I don’t know if you followed the SOPA fights last year but let me recap it real quick. So there was this bill, the Stop Online Piracy Act. And it was a bill that was promoted as a way to cut down on Internet piracy—so, unlawful sharing. But the bill was written in a way that would have allowed whole web sites to be taken down. And it became very clear that the bill was actually dangerous to the future of the Internet. It was a blacklist bill that would silence whole portals of speech.
And I was so impressed to see people banding together from all walks of life, every side of the political spectrum. I don’t mean just left and right, I mean all sorts of sides I didn’t even know existed. People who had never worked on Internet issues before, jumping up and campaigning hard for this. All of these different communities, they get the Internet. They get how important the Internet is to their future, and they get how much it brings to society. And when you show them that we’re talking about the future of the Internet and the future of our ability to communicate with one another through the Internet, all of a sudden it’s real for them.
Anderson: This seems like an issue that could unify people from a lot of different political stripes because it doesn’t matter what you believe, at some point you may be in the position of needing to advocate something unpopular. Whether you’re a gun owner, or whether you’re someone who’s pushing a really radical environmental bill… Do you think that this is an issue that maybe you could get a broad popular consensus on?
Reitman: I do think that it’s possible to galvanize people to understand how it affects them regardless of their political background. But I think that we are in our strongest position for advocating for rights when companies who exist and thrive because of the Internet understand the long-term value of defending the Internet and making a place that tolerates free speech and privacy, and join everyday citizens in fighting for that. We saw that with SOPA, for example. Too often, and I think this is really unfortunate, you see Internet startups that’re really worried about their startup business, period. They don’t have time to worry about anything else.
But in fact, they need to be in Washington lobbying for smart technology policies. Because the policies that are adopted today are the policies we’re going to have ten, fifteen, twenty years from now. They’re going to shape the policies we have fifty, a hundred years from now. We really have to try to get tech policy in place that tolerates both constitutional values and societal values that we founded our country on, and innovation. Period. It’s got to have both or we’re not going to be able to grow.
One thing we can push for is to create norms on the Internet. You know, you think about well what can we do to sort of safeguard the future of the Internet? And if we can create norms that are speech-tolerant, that are tolerant of privacy, that are tolerant of innovation—
Anderson: When we get into the norm, is that a moral question? Is it saying sort of that freedom of speech is a good? Maybe part of the big conversation we need to be having about digital liberties involves bringing that back to the front, and saying that here’s kind of an arational thing, our starting point that we need to talk about?
Reitman: Maybe. Maybe that’s right. I do think that free speech is an absolute good. I think that we can also come up with a lot of justifications for it. But I don’t know that we should spend a lot of time having to do that, when there’s so many clear threats to it on the horizon.
You know, our founding fathers kind of set out those constitutional rights for a reason. And it was because they had learned historically the consequence of not having them in place. We wanted to make the US different, and I think we’ve seen those same values be reflected in many of the international agreements around human rights that’ve…you know, it’s not just a United States thing.
So really, I think that we have come over hundreds of years to start to understand that there’s great great value in upholding free expression and freedom of religion. And I’d like to try to find ways to ensure that that continues in the digital world.
Anderson: Is this an issue of going out and persuading people? Or is this one of these things where people won’t really change until maybe the loss of freedom of speech really hits them in an economic way, and that’s when they fix freedom of speech? Do you think conversation matters?
Reitman: I think it does. Just to kind of encapsulate EFF’s theory of change, we fight for civil liberties on three battlegrounds. And the first is in the courtrooms. We try to create strong legal precedent, using the court systems, that will uphold constitutional rights in digital spaces even if they’re unpopular. Even if the average citizen would say, “Well, I don’t really care about that kind of free speech. It’s not something I agree with and I don’t really want it to exist.” Setting that aside, we’re using the court systems to fight for liberty, basically.
The second thing we do is create technologies. We have a technology team that actually builds technologies that you can use online to, for example, safeguard your privacy. We have a tool called HTTPS Everywhere that you can install for free and that will give you a little bit more security online.
And then what I do, and what I think has had a huge effect, is broad-scale advocacy. Teach the world about why this matters. Teach the world about how they need to get involved in fighting for the future of the Internet and the future of technology, and innovation and free speech and privacy, or we’re not going to have it later. Years ago that was more of a fringe issue; this is moving front and center. This is becoming something that has global scale, that’s involving tons of people on a level that I don’t think we could’ve imagined before. And I think it’s becoming something where everyday Internet users are picking up the phone by the thousands and demanding that Congress not infringe on their free speech rights online.
Even though I don’t know that our politicians are going to enact thoughtful technology policy that’s going to guide us along the way for the next fifty years, I think that the Internet’s going to survive anyway, and that what’s going to happen is you’re going to have geeks and netizens coming together and pushing boundaries in innovative ways, and creating amazing technologies. I suppose I’m a bit of an optimist about the human spirit to triumph and route around, whether it’s censorship or surveillance issues. No, there’s not gonna be a ton of forethought going into it but yes, we’re going to get through it anyway.
Aengus Anderson: One of the big things that jumped out at me was analogies of space.
Neil Prendergast: Yeah, that’s right where I kind of began. I was listening…the first sort of ten minutes I was thinking to myself, “Well, this person’s talking about public space but it’s only digital.” And then I was like, “Oh yeah, that’s like a huge thing, though.” [laughs]
Anderson: There’s not only digital…
Prendergast: Yeah, there’s not only digital, there’s this giant thing and she’s bringing it to our attention.
Anderson: Which is great, for people who don’t work in that realm all the time? Like, it was eye-opening for me to hear her use the public space analogy in that way. People talk about “information superhighway.” That’s an analogy that’s so overused, so yeah we think of the Internet as a series of roads. And we think of roads as physical spaces. We do not always think of roads as public spaces. And we certainly don’t take that thinking and port it over to the digital realm where we’re like, oh. Digital roads are all private roads. They’re all toll roads. There’s no public road, at all, and there’s no public square, at all. And then you begin to think like…well what do those things do out here in the real world? Like what are they for here? Like why public space here?
Prendergast: And I think there’s this bedrock of history that she’s sort of resting upon. And you go back to the labor movement and its deep origins in the 19th century, and its real strength going into the 20th century, and recognize how important public space was to that particular social movement. It would’ve been impossible for, say, the Industrial Workers of the World to really organize at all if they couldn’t speak in public. So often you hear people like Mother Jones, or Eugene Debs, really talking about free speech as much as they were talking about labor.
Anderson: But what if we don’t like Communists? What are some good movements that came out of public space? [both laugh]
Prendergast: Well. Okay. Well, most, right? I mean, the civil rights movement, claiming that look, there’s this way in which a restaurant or hotel or a bus is a public space. It might be privately owned but it’s a public accommodation, in the words of the Civil Rights Act. Those activists in the 50s and the 60s made people realize that access to public space meant access to being a part of the community. Public space has always been a part of our political history. And it’s really interesting to me that you know, the Internet has kind of grown at a moment where…maybe others would agree with me on this, maybe not, but to me it seems that there’s been a retreat from public space in the last two or three decades.
Anderson: And you mean in the physical world.
Prendergast: Yeah, yeah I mean in the physical world. You know, the town I live in, we actually have this town square. And I mean it’s a square. Most towns have a main street, ours dead ends at the town square.
Anderson: That’s so charming.
Prendergast: Yeah, it is very charming. It’s where the Christmas tree goes. It’s a genuine town square. It was there in the original plat of the town, it’s been public for the last 150 years, the origins of the town are that old. But people don’t shop on Main Street around the square as much as they used to, as is the case I think in a lot of small towns. And instead you know, the real hopping retail space is out near the interstate at a place called Crossroads Commons…which is not a commons at all.
Anderson: It’s deliciously ironic.
Prendergast: Yeah, right, right. That’s just one example but I think others could point to even more, where some people live in a privatopia, where they almost never touch public space except for the rubber of their SUV on the interstate.
Anderson: Which is interesting when you think about okay, so in the physical world (if I buy the argument you’re making), we’ve been getting increasing amounts of private space that kind of masquerades as public space. Giant corporate areas, gated communities, that sort of thing. And concurrently, so much of our existence has moved into digital space, which is all private. Do you think the move towards private space in the physical world has kind of conditioned us to accept going into a digital world where we don’t have any public space?
Prendergast: Yeah. I mean, I think that that’s kind of a workable thesis. I once heard a quote, something along the lines of, “when change happens, people follow the ideas that are lying around.” I think Milton Friedman said that, actually.
Anderson: See, you can never predict what these nutjobs are gonna say. He probably said it in public space, perhaps. You know, it’s interesting to think that okay, so we’ve got this retreat from public space in the physical world, a move into a totally private space in the digital world, and what are the ramifications of that? And I think that’s what Rainey does a really good job of showing us. Public space matters for any kind of social change. And when you don’t have it, and when your life is increasingly digital, you are really in a way surrendering your flexibility when it comes to trying to make a better future, a more fair future. I mean, you just mentioned labor movements. You mentioned civil rights. You know, there are a whole variety of things that in a way happened in quasi-legal or illegal ways, in public space, that led to reforms that we see as just absolutely essential, if not morally obvious today. And it’s interesting to think of like, what is a world without digital public space, and do you get that sort of change?
Prendergast: Right. And what’s interesting to me is how optimistic she seems to be?—
Anderson: Uh huh.
Prendergast: —that change can happen, that some public space can kinda be created online? I thought that that was really amazing given that there’s really not that space there now.
Anderson: And I wonder, can you get positive change out of private spaces in the digital world? Are you always going to be in like, illegal private spaces? If you could create a public digital space, what would that yield that would be different? And I think that’s something that I kinda wish we’d gotten into a little bit more in the conversation. What would a public digital space do? What sort of rights battle would happen there? What sort of government transparency battle would happen there? Or do you even need it? Can you have that all on private space? I don’t know and I think that’s something that’s probably worth a lot of thought.
That was Rainey Reitman, interviewed on June 13th, 2013 in San Francisco, California.
Further Reference
This interview at the Conversation web site, with project notes, comments, and taxonomic organization specific to The Conversation.