Scout Sinclair Brody: We’ve got a full program, so I’m going to ask y’all to keep it brief, but you all are involved in one way or another in the creation of technology. So tell us a little bit about who you are and what you do in this space. Let’s start with Tyler.
Tyler Reinhard: Hi, everyone. My name’s Tyler Reinhard. I’m a designer and a publisher. I began that work in 2000. I made my first web site when I was in high school to sell underground radical zines online via PO Box. And ever since then I’ve been working on projects like that, with varying escalations. Right now I’m working on Signal, the privacy app for iOS.
Sinclair Brody: Get it.
Reinhard: Yeah, get it. It’s pretty cool.
Harlo Holmes: And/or RedPhone.
Reinhard: Yeah, and/or RedPhone and TextSecure. Those three are a suite of tools [since combined as just Signal] that me and some other people work on with a project called Open Whisper Systems. I’m also the publisher of Mask Magazine, which is a low‐brow approach to discussing political issues for the Internet generation, I guess. So that’s me.
Sinclair Brody: Awesome. Harlo.
Holmes: Hey. I’m Harlo Holmes. I’m a digital security trainer for the Freedom of the Press Foundation, based out of San Francisco, but technically I’m the New York office, just me. I have a professional background in software development, primarily with The Guardian Project, which is a collective that builds a bunch of privacy‐forward mobile apps. And through The Guardian Project, I got to learn very very intimately about crypto systems. I’m not a cryptographer, I’m not a mathematician, but I’m really really good at APIs and really really good at using libraries, so I’ve come to a point where, having gotten to intimately learn how these codebases work, I am able to explain to people, journalists, whistleblowers, librarians, high school students, you name it, how these pieces of code come to bear on the apps that they use. I’ve always been really really interested, even since I was a kid, in secrets and how they’re protected and how they’re stored and kept, even though mine are really really boring.
Sinclair Brody: Alright, Ame.
Ame Elliot: I’m Ame Elliot. I’m a user experience designer. I love mobile apps that make you happy and that feel good under your thumb.
Sinclair Brody: Wonderful. The previous panel talked a bit about privacy as something that we’re struggling for. There are, however, a number of technologies, we’ve mentioned a couple already, that really have privacy as a core value. So when we talk about privacy‐preserving technology (Harlo, in your mind) what do we really mean? What does it mean for a technology to be privacy‐preserving?
Holmes: First off I’d like to say that it’s actually not about certain apps or certain pieces of software or whatever experiences. It’s actually about the fundamental theory of end‐to‐end encryption. There are different types of encryption, stuff that we’re really really used to seeing, like for instance TLS or SSL, the HTTPS that you see in the browser, and that’s an example of transport security. For instance in a Facebook conversation, I can definitely be sure that my message to someone else is encrypted until it gets to Facebook, but once it gets to Facebook, Facebook has every ability to read it as they pass it on to its intended destination.
So end‐to‐end encryption is important because it ensures that literally only the two terminuses of a conversation can perform that encryption and decryption function, and everything else in the middle is gobbledygook. All apps, all pieces of software that fulfill this basic need, I consider privacy‐forward, and those are the things that I want to promote no matter what shell they come in. No matter what they look like, as long as they’re fun to use and fulfill that principle.
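[Editor's note: the distinction Holmes draws can be sketched in toy form. This is a deliberately simplified illustration using an XOR one-time pad, not real cryptography; actual end‐to‐end messengers use audited protocols such as the Signal Protocol.]

```python
import secrets

def encrypt(key: bytes, msg: bytes) -> bytes:
    # Toy one-time pad: XOR each message byte with a key byte.
    # Illustrative only; real apps use vetted primitives.
    return bytes(m ^ k for m, k in zip(msg, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # known only to the two endpoints

# End-to-end: the sender's device encrypts, the relay (e.g. a server
# in the middle) only ever sees ciphertext, the recipient decrypts.
ciphertext = encrypt(key, message)   # sender's device
relay_sees = ciphertext              # gobbledygook to the server
plaintext = decrypt(key, relay_sees) # recipient's device

assert plaintext == message
assert relay_sees != message  # the middle cannot read it
```

With transport security alone (TLS to the server), the server holds the plaintext at the midpoint; in the end‐to‐end model above, only the two termini ever hold the key.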
Sinclair Brody: Tyler, would you agree? Would you define privacy‐preserving technologies similarly, and if so or if not, how would you contrast it with the experiences that we have in most popular products?
Reinhard: I think that I do agree with that. I think that privacy is something that we can think of in terms of a civil right, as individuals. Something that lets us consent to who can read our private ideas or private thoughts and who can’t. That’s a civil rights issue. But I think there’s also a way to think about it in terms of a social issue that’s larger than simply the individual. One example would be that over the course of the datafication of maps, we’ve all been using these maps to get around. One of the primary users of those maps is people who use them professionally. And now we’re in a situation where companies are able to leverage that data to produce cars that can move themselves.
When you think about how that functions in terms of people whose livelihoods, like for example all Teamsters, using a service and generating value for a company and then being in a situation where perhaps their livelihood could be completely obliterated, their consent to having their data used against their economic interests is something that is a social issue as well. So I think privacy‐preserving technology should be thought of in terms of civil rights, but I think it’s important to think of it also in terms of how we organize and defend our livelihoods in the system and systems that we live in. So I guess that’s my answer.
Sinclair Brody: A recent Pew study found that 93% of participants in the study rated the ability to communicate confidentially with someone as being very or somewhat important, yet it seems like most people I know have never heard of end‐to‐end encryption, much less prioritize using it. Why do you think that is, Ame?
Elliot: I think people care about privacy, and people care about security, but I think they also care about communicating with the people they want to communicate with. What we’ve seen is that people build up and accrete these biographies over time, where they communicate with different groups of people on different platforms. So if your grandparents are alive, you may be FaceTiming or Skyping with them. You may use Facebook to communicate with college friends, and WhatsApp to communicate with new professional people that you’re meeting on a job. You may have a robust inner life on Tinder. And it can be kind of awkward when you move people between those platforms.
So I think there’s an inherent dichotomy between the need and the delight and the pleasure of feeling private, and the awkwardness of expecting someone who’s downloading Signal for the first time to say, “Hey, my entire community, come with me onto this new platform.” That puts a bit of pressure on the people, and I think we need to find ways to get more people there.
Sinclair Brody: Tyler, what challenges do you think designers face, specifically, in trying to work in this privacy‐forward space, and what are the opportunities for them to make an impact?
Reinhard: I think the principal dilemma for designers is that we haven’t figured out a way to get away from a sort of data analysis‐driven economy. So if you want to make money as a designer, you’re going to work for the companies that can pay you to do that well. And you’re going to have to take on privacy work as a passion. So that passion, I think, is hard to place exactly in terms of what precisely are the problems that need solving in privacy from a design perspective, and how do we solve those problems in a way that’s competitive with an environment that is approaching completely different problems and setting the aesthetic standards.
For example, if you’re working on a messenger app, a lot of other messenger apps don’t have the same kinds of dilemmas. But the users of messenger apps expect them to work in a certain way and behave in a certain way. So how do you handle those challenges with existing standards? And to be positive about it, I think that designers and artists have a unique role in society in helping decide which social values get associated with the passé. You get to say what conclusions we’ve already come to as a society, what is tacky, what are old‐fashioned ways of thinking about the world. We get to do that as designers and artists, and I think that’s an incredible power we shouldn’t abuse, and I think not abusing it means embracing privacy and encryption as a matter of social value, I guess is how I would phrase that.
Sinclair Brody: Harlo, what’s your perspective? What are the challenges technologists and designers face in collaborating on the creation of privacy‐preserving technologies?
Holmes: First off, taking privacy and security into your own hands as an end user is not always easy, and I find it a little bit disappointing that a lot of apps that you might download make it seem as if it were as easy as installing an app and starting to chat. For the most part, I think that, one, we could as a community who is privacy‐forward and focused or whatever, put pressure on app builders to incorporate certain libraries that we know work and work well, and to actually just press the dialogue on that. And also, from a design perspective, although I’m definitely not a designer at all, design could go a long way toward holding a user’s hand where need be, to get them where they need to be to take privacy and encryption and security, etc. into their own hands.
Also, one thing that I think is actually really really successful. Maybe you guys hadn’t noticed this, maybe you have, but your Twitter direct messages are of unlimited length, which actually makes it a perfect vector for a PGP conversation. But those are little, little winks to people who are privacy‐focused to say that we have people on design staff who might actually have thought about this and care about this. And putting in little Easter eggs here and there. Oh, also, side note: Twitter, like, years ago enabled you to proxy over SOCKS, so you could potentially send tweets over Tor, which is interesting.
That said (That might’ve been a lot of jargon for people in the audience. I’m very sorry. Deeply sorry.) there are Easter eggs that you can find in technologies that are already really really popular that, depending on your knowledge and depending on where you find your information and how that knowledge spreads, you might actually up your game security‐wise. So having the option to play it safe but also enabling users who want to swim in the deep end of the pool is actually really really cool and something that I would promote.
Sinclair Brody: Ame, what’s our path forward?
Elliot: I’m glad you asked, Scout. I already gave the pitch about the glass jar there, if you’re interested in this. I’m particularly looking to connect with user experience designers and interaction designers who are working on security and privacy, who are in that deep end of the pool, and who are the people making these Easter eggs. I’d like to figure out how we as a community can believe in people and give people the good experiences that they want, where privacy and security are positives that bring people pleasure and confidence.
Sinclair Brody: Alright. Well, thank all three of you. Once again, the perspective that you heard from these three amazing people is just the start of the conversation, so find them afterward, and pick their brains. Particularly pick Harlo’s brain about all those gobbledygook words she was using, because they’re really useful.