Scout Sinclair Brody: We’ve got a full program, so I’m going to ask y’all to keep it brief, but you all are involved in one way or another in the creation of technology. So tell us a little bit about who you are and what you do in this space. Let’s start with Tyler.

Tyler Reinhard: Hi, everyone. My name’s Tyler Reinhard. I’m a designer and a publisher. I began that work in 2000. I made my first web site when I was in high school to sell underground radical zines online via PO Box. And ever since then I’ve been working on projects like that, with varying escalations. Right now I’m working on Signal, the privacy app for iOS.

Sinclair Brody: Get it.

Reinhard: Yeah, get it. It’s pretty cool.

Harlo Holmes: And/or RedPhone.

Reinhard: Yeah, and/or RedPhone and TextSecure. Those three are a suite of tools [since combined as just Signal] that me and some other people work on with a project called Open Whisper Systems. I’m also the publisher of Mask Magazine, which is a low-brow approach to discussing political issues for the Internet generation, I guess. So that’s me.

Sinclair Brody: Awesome. Harlo.

Holmes: Hey. I’m Harlo Holmes. I’m a digital security trainer for the Freedom of the Press Foundation, based out of San Francisco, but technically I’m the New York office, just me. I have a professional background in software development, primarily with The Guardian Project, which is a collective that builds a bunch of privacy-forward mobile apps. And through The Guardian Project, I got to learn very very intimately about crypto systems. I’m not a cryptographer, I’m not a mathematician, but I’m really really good at APIs and really really good at using libraries, so I’ve come to a point where, having gotten to intimately learn how these codebases work, I am able to explain to people, journalists, whistleblowers, librarians, high school students, you name it, how these pieces of code come to bear on the apps that they use. I’ve always been really really interested, even since I was a kid, in secrets and how they’re protected and how they’re stored and kept, even though mine are really really boring.

Sinclair Brody: Alright, Ame.

Ame Elliot: I’m Ame Elliot. I’m a user experience designer. I love mobile apps that make you happy and that feel good under your thumb.

Sinclair Brody: Wonderful. The previous panel talked a bit about privacy as something that we’re struggling for. There are, however, a number of technologies, we’ve mentioned a couple already, that really have privacy as a core value. So when we talk about privacy-preserving technology (Harlo, in your mind) what do we really mean? What does it mean for a technology to be privacy-preserving?

Holmes: First off I’d like to say that it’s actually not about certain apps or certain software or whatever, experiences. It’s actually about the fundamental theory of end-to-end encryption. There’s different types of encryption, stuff that we’re really really used to seeing, like for instance TLS or SSL, HTTPS that you see in the browser, and that’s an example of transport security. For instance in a Facebook conversation, I can definitely be sure that my message to someone else is encrypted when it gets to Facebook, but once it gets to Facebook, Facebook has every ability to read it as they pass it on to its intended destination.

So end-to-end encryption is important because it ensures that literally only the two terminuses of a conversation can perform that encryption and decryption function, and everything else in the middle is gobbledygook. All apps, all pieces of software that fulfill this basic need, I consider privacy-forward, and those are the things that I want to promote no matter what shell they come in. No matter what they look like, as long as they’re fun to use and fulfill that principle.
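The principle Holmes describes can be sketched in a few lines: the two endpoints agree on a shared secret that the relay in the middle never learns, then use it to scramble messages. The toy Diffie-Hellman exchange below is a hypothetical illustration only, not the Signal protocol; the small prime and the XOR "cipher" are deliberately simplistic stand-ins for the vetted primitives (Curve25519, AES, and so on) that real systems use.

```python
import hashlib

# Public parameters both sides (and any eavesdropper) can see.
P = 0xFFFFFFFFFFFFFFC5  # a 64-bit prime; far too small for real use
G = 5                   # public generator

def derive_key(shared_secret: int) -> bytes:
    """Hash the agreed-upon secret down to a 32-byte symmetric key."""
    return hashlib.sha256(str(shared_secret).encode()).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Each endpoint keeps a private number and publishes only G^priv mod P.
alice_priv, bob_priv = 1234567, 7654321
alice_pub = pow(G, alice_priv, P)
bob_pub = pow(G, bob_priv, P)

# Both sides compute the same secret without ever transmitting it.
alice_key = derive_key(pow(bob_pub, alice_priv, P))
bob_key = derive_key(pow(alice_pub, bob_priv, P))
assert alice_key == bob_key

ciphertext = xor_cipher(b"meet at noon", alice_key)  # what the server relays
plaintext = xor_cipher(ciphertext, bob_key)          # only Bob can undo it
print(plaintext)  # b'meet at noon'
```

The point is structural: with transport security alone, the server in the middle holds the plaintext; here it only ever relays `alice_pub`, `bob_pub`, and `ciphertext`, all gobbledygook without one of the private keys.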

Sinclair Brody: Tyler, would you agree? Would you define privacy-preserving technologies similarly, and if so or if not, how would you contrast it with the experiences that we have in most popular products?

Reinhard: I think that I do agree with that. I think that privacy is something that we can think of in terms of a civil right, as individuals. Something that we are able to consent to, who can read our private ideas or private thoughts and who can’t. That’s a civil rights issue. But I think there’s also a way to think about it in terms of a social issue that’s larger than simply the individual. One example would be that over the course of the datafication of maps, we’ve all been using these maps to get around. One of the primary users of those is people who professionally use those maps. And now we’re in a situation where companies are able to leverage that data to produce cars that can move themselves.

When you think about how that functions in terms of people whose livelihoods, like for example all Teamsters, using a service and generating value for a company and then being in a situation where perhaps their livelihood could be completely obliterated, their consent to having their data used against their economic interests is something that is a social issue as well. So I think privacy-preserving technology should be thought of in terms of civil rights, but I think it’s important to think of it also in terms of how we organize and defend our livelihoods in the system and systems that we live in. So I guess that’s my answer.

Sinclair Brody: A recent Pew study found that 93% of participants in the study rated the ability to communicate confidentially with someone as being very or somewhat important, yet it seems like most people I know have never heard of end-to-end encryption, much less prioritize using it. Why do you think that is, Ame?

Elliot: I think people care about privacy, and people care about security, but I think they also care about communicating with the people they want to communicate with. What we’ve seen is that people build up and accrete these biographies over time, where they speak, where they communicate with different groups of people on a different platform. So if your grandparents are alive, you may be FaceTiming or Skyping with them. You may use Facebook to communicate with college friends, and WhatsApp to communicate with new professional people that you’re meeting on a job. You may have a robust inner life on Tinder. And it can be kind of awkward when you move people between those platforms.

So I think there’s an inherent dichotomy between the need and the delight and the pleasure of feeling private, and the awkwardness of expecting someone who’s downloading Signal for the first time to say, “Hey, my entire community, come with me onto this new platform.” That puts a bit of pressure on the people, and I think we need to find ways to get more people there.

Sinclair Brody: Tyler, what challenges do you think designers face, specifically, in trying to work in this privacy-forward space, and what are the opportunities for them to make an impact?

Reinhard: I think the principal dilemma for designers is that we haven’t figured out a way to get away from a sort of data analysis-driven economy. So if you want to make money as a designer, you’re going to work for the companies that can pay you to do that well. And you’re going to have to take on privacy work as a passion. So that passion, I think, is hard to place exactly in terms of what precisely are the problems that need solving in privacy from a design perspective, and how do we solve those problems in a way that’s competitive with an environment that is approaching completely different problems and presenting aesthetic standards.

For example, if you’re working on a messenger app, a lot of other messenger apps don’t have the same kinds of dilemmas. But the users of messenger apps expect them to work in a certain way and behave in a certain way. So how do you handle those challenges with existing standards? And to be positive about it, I think that designers and artists have a unique role in society in helping social values be associated with on one hand the passé. You get to say what conclusions we’ve already come to as a society, what is tacky, what is old-fashioned ways of thinking about the world. We get to do that as designers and artists, and I think that’s an incredible power we shouldn’t abuse, and I think not abusing it means embracing privacy and encryption as a matter of social value, I guess is how I would phrase that.

Sinclair Brody: Harlo, what’s your perspective? What are the challenges technologists and designers face in collaborating on the creation of privacy-preserving technologies?

Holmes: First off, taking privacy and security into your own hands as an end user is not always easy, and I find it a little bit disappointing that a lot of apps that you might download make it seem as if it were as easy as installing an app and starting to chat. For the most part, I think that one, we could, as a community who is privacy-forward and focused or whatever, put pressure on app builders in order to incorporate certain libraries that we know work and work well, and to actually just press the dialogue on that. And also, from a design perspective, although I’m definitely not a designer at all, design could go a long way in order to hold a user’s hand where need be, in order to get them where they need to be, in order to take privacy and encryption and security, etc. into their own hands.

Also, one thing that I think is actually really really successful. Maybe you guys hadn’t noticed this, maybe you have, but your Twitter direct messages are of unlimited length, which actually makes it a perfect vector for a PGP conversation. But those are little, little winks to people who are privacy-focused to say that we have people on design staff who might actually have thought about this and care about this. And putting in little Easter eggs here and there— Oh, also, side-note: Twitter, like years ago, enabled you to proxy over SOCKS, so you could potentially send tweets over Tor, which is interesting.
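The SOCKS proxying Holmes mentions is a simple byte protocol. As a rough sketch, assuming the RFC 1928 (SOCKS5) framing, this builds the two messages a client would send to a local proxy such as Tor’s, which traditionally listens on 127.0.0.1:9050; it only constructs the bytes, since actually sending them requires a running proxy.

```python
import struct

def socks5_greeting() -> bytes:
    """VER=5, one auth method offered, method 0x00 = no authentication."""
    return b"\x05\x01\x00"

def socks5_connect(host: str, port: int) -> bytes:
    """CONNECT request using the domain-name address type (ATYP=0x03)."""
    name = host.encode("ascii")
    return (b"\x05\x01\x00\x03"          # VER, CMD=CONNECT, RSV, ATYP
            + bytes([len(name)]) + name  # length-prefixed hostname
            + struct.pack(">H", port))   # port, big-endian

# The proxy, not the client, resolves the hostname and opens the
# connection, which is exactly why routing through Tor hides the
# client's address from the destination.
request = socks5_connect("twitter.com", 443)
assert request[0] == 0x05 and request[3] == 0x03
```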

That said (That might’ve been a lot of jargon for people in the audience. I’m very sorry. Deeply sorry.) there are Easter eggs that you can find in technologies that are already really really popular that, depending on your knowledge and depending on where you find your information and how that knowledge spreads, you might actually up your game security-wise. So having the option to play it safe but also enabling users who want to swim in the deep end of the pool is actually really really cool and something that I would promote.

Sinclair Brody: Ame, what’s our path forward?

Elliot: I’m glad you asked, Scout. I already gave the pitch about the glass jar there, if you’re interested in this. I’m particularly looking to connect with user experience designers and interaction designers that are working on security and privacy, that’re in that deep end of the pool and that’re the people making these Easter eggs. I’d like to figure out how we as a community can believe in people and give people the good experiences that they want, where privacy and security are positives that bring people pleasure and confidence.

Sinclair Brody: Alright. Well, thank all three of you. Once again, the perspective that you heard from these three amazing people is just the start of the conversation, so find them afterward, and pick their brains. Particularly pick Harlo’s brain about all those gobbledygook words she was using, because they’re really useful.