Meredith Whittaker: Thank you guys so much for being here. I want to start out by just asking you to introduce yourselves. You all do so much, but how is your work related to themes of agency, power dynamics, and the determination of whether technology really works for you or not? I’ll start with you, Kate.

Kate Crawford: I just want to start by saying a huge thank you to Meredith, to everyone here at Pioneer Works, and to all of you for coming out. This is a huge turnout. It’s amazing and incredibly exciting.

So how does my work relate? I’m a professor and a writer, and my work very much focuses on power asymmetries in large-scale data collection. I look at that at different layers: essentially empirical research around bio data, but also how data is collected in cities, how it’s collected in workplaces, and finally how it’s collected by the state.

I sort of sit in a weird place in that I have academic affiliations at MIT and NYU, but I’m also based in an industrial research lab at MSR. So I have the privilege of sitting down next to the computer scientists who in many cases are building some of the Big Data systems that are fascinating to me but in many cases very concerning as well.

Whittaker: Thank you. Allison.

Allison Burtch: Hi. My name is Allison Burtch. I’m a technologist and a writer, and I’ve made work that involves jamming, in certain situations. I made a Log Jammer, which creates a safe space in nature away from technology. I’ve also done other political work. I organized a conference after PRISM at Eyebeam Art & Technology Center, and also with Occupy Wall Street.

Whittaker: Lauren, how does your work relate to the themes of agency and the ability to push back against or form technology?

Lauren McCarthy: Hi. I’m Lauren McCarthy and I guess I’m most interested in the systems and rules involved with being a person and interacting with other people in today’s technological reality, or just today. So what does that mean? I guess a couple of examples.

A couple of years ago I went on a bunch of dates and I tried to crowd-source my dating life because I just wasn’t really cutting it on my own. So I streamed the dates to the Web with my phone and then I paid Mechanical Turk workers to watch the date and to decide what I should say and do, and I would get these messages and do them.

One other quick example. Just recently I finished a project with Kyle McDonald where we made an app that paired with a smartwatch so it could measure your biological signals and figure out how people and how your friends made you feel, so you don’t have to. Cuz who has time? Then it would automatically schedule them into your life or delete or block them accordingly.

The point with all of these is to ask the question of, could a computer or could an algorithm actually make better social decisions than we could ourselves? And if so, how do we feel about that and what do we do? And what does it mean, “better”? What is improvement? And how are these ideas embedded into the systems that we use and the systems that we build?

Whittaker: Whoa. So one of the questions that I’m actually sort of struggling with in putting this together and thinking through this theme is, where do we have a choice now? And I see your work as experimenting a lot with those dynamics, and I would love your take. Like, where do we have a say in what we do and do not accept? Where is agency a part of our relationship to technology?

McCarthy: I think that there’s always an opportunity to try to find the loopholes, to misuse or to reappropriate the technologies that we have. We see this a lot. We see these little glitches where we get into a part of a system or we use a tool or technology or an app in a way that wasn’t intended and we receive a kind of funny result. And I think this reveals something about the expectations of the people that made the system. No technology is neutral. There are always these embedded assumptions, expectations, and biases.

So there’s ways like that, as an individual, to push back, or maybe as a community. But I think beyond that, sometimes the technology reveals our biases in ourselves, our expectations about each other. So I think one question we have to be asking while we’re looking for the place we can push back against technology and its expectations is… Where can we push back against our own biases and expectations? Can we create an environment where people actually feel supported and able to define their own identity? It’s not just the technology that’s limiting people. We make these technologies.

Whittaker: Kate, I want to direct the same question to you. You deal with sort of the undercurrent of a lot of the techno-consumerism. Where do we have places to push back? What are the inflection points?

Crawford: It’s interesting, because I worry that this debate about agency is basically the big lie. And I think the big lie is that we think we can control what we do. Yeah, it’s fantastic that there are encrypted apps that we can use, and they’re really important, but the first thing that worries me is that we’re making this an individual problem. We’re trying to say it’s on you, it’s your responsibility, figure this out, download this, understand end-to-end encryption, when it’s a shared problem and it’s a communal problem. And ultimately, we are valued as data points, collectively, far more than as individuals. This is a shared problem.

So how do we think beyond that very individualistic frame into something that’s more communal? I think that’s the first big problem and why I want to get past this kind of “figure it out yourself, kids” perspective that comes out in these debates.

I think the other thing that’s really interesting about agency here, too, is that we think about this historically as something that we can fix now. But this has got a really long trajectory, both behind it (I was really glad to hear Sarah talk about what was happening in the 40s and 50s) and ahead of it: we can also think twenty, thirty years out from now, when the data that we’ve already released continues to be used in particular ways. So we have this much bigger trajectory to start thinking about in terms of agency.

But if you want me to say something positive, and I can kinda tell that you do, I think one of the more exciting projects that I did recently was with my colleague Luke Stark at NYU. We interviewed just under forty artists from the US, from Europe, and from the Middle East. Also included in this project are of course Allison and Lauren, because they do kick-ass work in this space. We specifically talked to artists about what kinds of provocative interventions they are making using computational platforms. But we also asked them, “What wouldn’t you do? Where is the line where you say this is not okay, this is not an ethical use of other people’s data?”

And what I loved about doing this study, and we are just about to finish it so I’ll share it with you soon, is the realization that artists are thinking about this stuff in some ways in much more sophisticated frameworks than the computer scientists who so often get the stage to talk about data and ethics. And I really want to basically highlight this community, who are thinking about these kinds of problems in ways that just don’t get heard enough.

Whittaker: Yes. Thank you.

So I’m just riffing on this. We have a narrative of consumer choice, like you can take it or leave it: the invisible hand, the free market, we are agents, we are individuals. I would like to direct this to you, Allison. Where does that stop and start, and what would, in your view, an actual ability to say no or to openly embrace technology look like?

Burtch: I think what Kate said about individual choice is really important in this, because when you say it that way, we’re framing this as a “no” to something, and I’m more interested in what the “yes” is, what the other thing is. Because when we look at our lives as a disconnection from, then we’re always disconnecting. And that’s actually not a stable place. So a lot of the privacy discussion, again, is about individual choice, but we live in public and we need to do political work in public. So it’s not all about this neoliberal “I can hide in my room and buy drugs on the darknet and that’s good for me,” which is like, whatever, fine. But Anwar al-Awlaki’s son got drone-bombed. What’s his privacy choice? We’re making these technologies, and when we talk about privacy it’s this super American, bourgeois, “I can have all these things and download all these apps.” But what we’re facing is a much bigger, collective, political issue. If that was clear at all.

Whittaker: I have written down here “It takes a village to make a social network social.” So just jumping to the theme of interdependence and community and our sort of primate sociality, you know, intersubjective beings, I would love you to continue talking on that theme.

Burtch: Sure. Okay. Sort of riffing on that, and this is something that I’ve been thinking about a lot. We all talk about the end of the world a lot. Like the Anthropocene, we’re all totally fucked. And so what I’m wondering about is, with the mindset that we’re totally fucked, all we have is a survival strategy. All we have is, how can we protect ourselves from being totally fucked? Whereas, what if that wasn’t true? Hypothetically. What’s actually the common horizon that we can build together that we want? And so that’s what I’ve been thinking about. Survival strategy versus common horizon, how do we protect ourselves individually versus what are we working towards together, and the sort of mindset that goes into that.

Whittaker: And Kate, I would love your view on that question as well, [as] someone who works with humans as statistics, the aggregate numbers, data as value. How does that intersect with the idea of intersubjective humans?

Crawford: Small question. Tiny question. This is really interesting, because obviously you could take a really big picture here and say that subjectivity itself is shifting in really interesting ways. We can look at our own histories with this kind of granularity and capacity that we didn’t have as recently as fifteen years ago. How does that change our understanding of ourselves? There are tools that will help you think about that.

What we’re less good at, I think, is using that capacity to say, “What’s changed in our political landscape? What’s changed in our ability to join a union, go to a protest, express an unpopular opinion?” These activities that used to be seen as so core to democratic functioning have a very different valency if you’re being recorded every time you go into the street, if your email is in the clear and, let’s face it, it’s pretty easy for any state agency to actually look at that, given certain policy restrictions.

So I feel like there’s some very big questions that we need to think through around what is the political now? And how is that shaped without those spaces of at least semi-anonymity that became so much a part of how we understood the political process in, say, the 20th century?

Whittaker: Yes. I want to know the answer to that question. Lauren, I want to turn it to you for a second. A lot of your work deals with actually building communities and isolated experiences of technology. Can you speak to the dichotomies there: the awkwardnesses, the sort of tensions of trying to involve more than one person in those experiences?

McCarthy: We’re in a space now where there’s so much more transparency, so much more connection, so much more is public. And I think there are some things that are good about this. We’ve seen the potential for there to be real social change because of some important issue that rises to the surface. And I think that’s good, and what we need to be doing is trying to listen to people who are different from us, and to let that happen.

But at the same time I think a lot of these technologies that are all about connection or sharing or social or whatever are actually a little bit more isolating than connecting. They bring out this feeling of competition. I think there’s something a little dangerous whenever there’s an interface between people, because you forget a little bit that they’re a person when you can’t see them or feel them in front of you. So I wonder if we could build systems that aren’t focused on fostering ego, but instead on fostering understanding, on fostering empathy.

Whittaker: I’m going to put this to all of you as sort of a closer. Where should people who want to do that engage? Where do you see the opportunities or the wedges or the spaces in these systems to engage in a way that would allow that kind of relationality, to allow that kind of choice?

Burtch: That’s… [long pause]

Crawford: That’s my answer, too.

Burtch: That’s not on the thing. I’m just trying to have fun. I’m just trying to live a fun life, because… I don’t know. Everything gets so terrible, and like, what’s the point if everything’s just miserable all the time? We need to actually build communities and relationships and intimacy and beauty and joy and figure out how to build stuff… Because you can’t fight alone. The lone wolf is over. You need to fight together. And imperialism and capitalism and colonialism have done a number on us psychologically. So we… I don’t really know where I’m going with that. But you asked me where I find the wedges, and I’m just right now trying to do jiu-jitsu, basically. That’s what I’m doing.

Whittaker: Amen. That was heartfelt. Lauren, you got a take?

McCarthy: I don’t know if I can follow that, but I’ll just pick it up from Allison. I liked when you said, “What is your yes?” When we’re thinking about issues of surveillance and privacy, there’s so much fear a lot of times. And I get that we need to think about these things, but just flat-out rejection of a technology, or “I don’t like that,” isn’t really going to cut it, because it’s here and we’re moving forward. So how could we move forward in a more productive way, and how can we kind of dig through this gray area instead of just saying, “It’s all black, I’m not going to deal with it”? And I guess for me, I start with little personal experiments, little tests. Then someday you might do something and realize, oh, that wasn’t a wall there, I could actually go into that space. And then it gets bigger. So maybe I could do that. How could I bring other people with me, or how could we do this as a community?

Crawford: Alright, well I’ll give you a really personal answer and then I’ll give you a slightly bigger picture answer.

The personal answer, the thing that I do that gives me hope in this right now, is Deep Lab, a group of feminist artists and researchers, which I’m very thrilled to say has two members on stage here, Harlo and Allison. We’re trying to think about particular kinds of interventions we could make together, because we’re stronger with more of us than just working alone. And that’s been really fantastic.

At a bigger-picture level, what can we do? Yes, it’s really important that we do develop stronger communities, that we do develop stronger tools, and that we support those communities doing encryption tool development. But I don’t want us to feel like we have to retreat and that we have to hide. I also want us to make a lot of noise. I want us to think about: what are the most public statements we can make collectively? What are the pressure points in terms of what’s happening around public policy, around the communities that are really fighting these issues? And many of those communities are vulnerable and marginalized communities. How do we help them?

So for me, these are the ways that I really want to figure out what those pressure points are. What’s the jiu-jitsu? And how do we figure that out together?

Whittaker: Yes. Thank you.

