Meredith Whittaker: Thank you guys so much for being here. I want to start out by just asking you to introduce yourselves. You all do so much, but how is your work related to themes of agency, power dynamics, and the determination of whether technology really works for you or not? I’ll start with you, Kate.
Kate Crawford: I just want to start by saying a huge thank you to Meredith, to everyone here at Pioneer Works, and for all you guys for coming out. This is a huge turnout. It’s amazing and incredibly exciting.
So how does my work relate? I’m a professor and a writer, and my work very much focuses on power asymmetries in large‐scale data collection. I look at that at different layers: empirical research around biodata, how data is collected in cities, how it’s collected in workplaces, and finally how it’s collected by the state.
I sit in a slightly weird place in that I have academic affiliations at MIT and NYU, but I’m also based in an industrial research lab at Microsoft Research. So I have the privilege of sitting down next to the computer scientists who in many cases are building some of the Big Data systems that are both fascinating to me and, in many cases, very concerning as well.
Whittaker: Thank you. Allison.
Allison Burtch: Hi. My name is Allison Burtch. I’m a technologist and a writer, and I’ve made work that involves jamming, in certain situations. I made a Log Jammer, which creates a safe space in nature away from technology. I’ve also done other political work: I organized a conference after PRISM at Eyebeam Art & Technology Center, and also worked with Occupy Wall Street.
Whittaker: Lauren, how does your work relate to the themes of agency and the ability to push back against or shape technology?
Lauren McCarthy: Hi. I’m Lauren McCarthy and I guess I’m most interested in the systems and rules involved with being a person and interacting with other people in today’s technological reality, or just today. So what does that mean? I guess a couple of examples.
A couple of years ago I went on a bunch of dates and I tried to crowd‐source my dating life because I just wasn’t really cutting it on my own. So I streamed the dates to the Web with my phone and then I paid Mechanical Turk workers to watch the date and to decide what I should say and do, and I would get these messages and do them.
One other quick example: just recently I finished a project with Kyle McDonald where we made an app that paired with a smartwatch so it could measure your biological signals and figure out how your friends made you feel, so you don’t have to. Because who has time? Then it would automatically schedule them into your life, or delete or block them, accordingly.
The point with all of these is to ask the question of, could a computer or could an algorithm actually make better social decisions than we could ourselves? And if so, how do we feel about that and what do we do? And what does it mean, “better?” What is improvement? And how are these ideas embedded into the systems that we use and the systems that we build?
Whittaker: Whoa. So one of the questions that I’m actually sort of struggling with in putting this together and thinking through this theme is where do we have a choice now? And I see your work as experimenting a lot with those dynamics, and I would love your take. Like, where do we have a say in what we do and do not accept? Where is agency a part of our relationship to technology?
McCarthy: I think that there’s always an opportunity to try to find the loopholes, to misuse or to reappropriate the technologies that we have. We see this a lot. We see these little glitches where we get into a part of a system or we use a tool or technology or an app in a way that wasn’t intended and we receive a kind of funny result. And I think this reveals something about the expectations of the people that made the system. No technology is neutral. There’s always these embedded assumptions, expectations, and biases.
So there are ways like that, as an individual, to push back, or maybe as a community. But I think beyond that, sometimes the technology reveals the biases in ourselves, our expectations about each other. So one question we have to be asking, while we’re looking for the places we can push back against technology and its expectations, is: where can we push back against our own biases and expectations? Can we create an environment where people actually feel supported and able to define their own identity? It’s not just the technology that’s limiting people. We make these technologies.
Whittaker: Kate, I want to direct the same question to you. You deal with sort of the undercurrent of a lot of the techno‐consumerism. Where do we have places to push back? What are the inflection points?
Crawford: It’s interesting, because I worry that this debate about agency is basically the big lie. And the big lie is that we think we can control what we do. Yes, it’s fantastic that there are encrypted apps that we can use, and they’re really important, but what worries me first is that we’re making this an individual problem. We’re trying to say it’s on you, it’s your responsibility: figure this out, download this, understand end‐to‐end encryption. When really it’s a shared problem, a communal problem. And ultimately, we are valued as data points, collectively, far more than as individuals. This is a shared problem.
So how do we think beyond that very individualistic frame into something that’s more communal? I think that’s the first big problem, and why I want to get past this kind of “figure it out yourself, kids” perspective that comes out in these debates.
I think the other thing that’s really interesting about agency here, too, is that we think about this historically as something that we can fix now. But this has got a really long trajectory, both behind it (I was really glad to hear Sarah talk about what was happening in the 40s and 50s) but we can also think twenty, thirty years out from now, where the data that we’ve already released continues to be used in particular ways. So we have this much bigger trajectory to start thinking about in terms of agency.
But if you want me to say something positive, and I can kinda tell that you do, I think one of the more exciting projects I did recently was with my colleague Luke Stark at NYU. We interviewed just under forty artists from the US, from Europe, and from the Middle East. Also included in this project are, of course, Allison and Lauren, because they do kick‐ass work in this space. We specifically talked to artists about what kinds of provocative interventions they are making using computational platforms. But we also asked them, “What wouldn’t you do? Where is the line where you say this is not okay, this is not an ethical use of other people’s data?”
And what I loved about doing this study, and we are just about to finish it so I’ll share it with you soon, is the realization that artists are thinking about this stuff in some ways in much more sophisticated frameworks than the computer scientists who so often get the stage to talk about data and ethics. And I really want to basically highlight this community, who are thinking about these kinds of problems in ways that just don’t get heard enough.
Whittaker: Yes. Thank you.
So I’m just riffing on this. We have a narrative of consumer choice: you can take it or leave it; the invisible hand, the free market; we are agents, we are individuals. I would like to direct this to you, Allison. Where does that stop and start, and what would, in your view, an actual ability to say no to, or to openly embrace, technology look like?
Burtch: I think what Kate said about individual choice is really important here, because we keep framing this as a “no” to something, and I’m more interested in what the “yes” is, what the other thing is. Because when we look at our lives as a disconnection from, then we’re always disconnecting. And that’s actually not a stable place. So a lot of the privacy discussion, again, is about individual choice, but we live in public and we need to do political work in public. So it’s not all about this neoliberal “I can hide in my room and buy drugs on the darknet and that’s good for me,” which is like, whatever, fine. But Anwar al‐Awlaki’s son got drone‐bombed. What was his privacy choice? We’re making these technologies, and when we talk about privacy it’s this super American, bourgeois, “I can have all these things and download all these apps.” But what we’re facing is a much bigger, collective, political issue. If that was clear at all.
Whittaker: I have written down here “It takes a village to make a social network social.” So just jumping to the theme of interdependence and community and our sort of primate sociality, you know, us as intersubjective beings, I would love for you to continue talking on that theme.
Burtch: Sure. Okay. Riffing on that, this is something that I’ve been thinking about a lot. We all talk about the end of the world a lot, like the Anthropocene, we’re all totally fucked. And with the mindset that we’re totally fucked, all we have is a survival strategy: how can we protect ourselves from being totally fucked? Whereas, what if that wasn’t true? Hypothetically. What’s actually the common horizon that we can build together, that we want? That’s what I’ve been thinking about: survival strategy versus common horizon, how do we protect ourselves individually versus what are we working towards together, and the mindset that goes into each.
Whittaker: And Kate, I would love your view on that question as well, [as] someone who works with humans as statistics, the aggregate numbers, data as value. How does that intersect with the idea of intersubjective humans?
Crawford: Small question. Tiny question. This is really interesting, because obviously you could take a really big picture here and say that subjectivity itself is shifting in really interesting ways. We can look at our own histories with this kind of granularity and capacity that we didn’t have as recently as fifteen years ago. How does that change our understanding of ourselves? There are tools that will help you think about that.
What we’re less good at, I think, is using that capacity to say, “What’s changed in our political landscape? What’s changed in our ability to join a union, go to a protest, express an unpopular opinion?” These activities that used to be seen as so core to democratic functioning have a very different valency if you’re being recorded every time you go into the street. If your email is in the clear and, let’s face it, it’s pretty easy for any state agency to actually look at it, given certain policy restrictions.
So I feel like there’s some very big questions that we need to think through around what is the political now? And how is that shaped without those spaces of at least semi‐anonymity that became so much a part of how we understood the political process in say, the 20th century?
Whittaker: Yes. I want to know the answer to that question. Lauren, I want to turn to you for a second. A lot of your work deals with actually building communities and isolated experiences of technology. Can you speak to the dichotomies there: the awkwardnesses, the tensions, of trying to involve more than one person in those experiences?
McCarthy: We’re in a space now where there’s so much more transparency, so much more connection, so much more is public. And I think there are some things that are good about this. We’ve seen the potential for real social change because some important issue rises to the surface. And I think that’s good, and what we need to be doing is trying to listen to people who are different from us, and to let that happen.
But at the same time I think a lot of these technologies that are all about connection or sharing or social or whatever are actually a little bit more isolating than connecting. They bring out this feeling of competition. I think there’s something a little dangerous whenever there’s an interface between people, because you forget a little bit that they’re a person when you can’t see them or feel them in front of you. So I wonder if we could build systems that aren’t focused on fostering ego, but instead on fostering understanding, on fostering empathy.
Whittaker: I’m going to put this to all of you as sort of a closer. Where should people who want to do that engage? Where do you see the opportunities or the wedges or the spaces in these systems to engage in a way that would allow that kind of relationality, to allow that kind of choice?
Burtch: That’s… [long pause]
Crawford: That’s my answer, too.
Burtch: That’s not on the thing. I’m just trying to have fun. I’m just trying to live a fun life, because… I don’t know. Everything gets so terrible, and like what’s the point if everything’s just miserable all the time? We need to actually build communities and relationships and intimacy and beauty and joy and figure out how to build stuff— Because you can’t fight alone. The lone wolf is over. You need to fight together. And imperialism and capitalism and colonialism have done a number on us psychologically. So we… I don’t really know where I’m going with that. But you asked me where I find the wedges, and I’m just right now trying to do jiu‐jitsu, basically. That’s what I’m doing.
Whittaker: Amen. That was heartfelt. Lauren, you got a take?
McCarthy: I don’t know if I can follow that, but I’ll just pick it up from Allison. I liked when you said, “What is your yes?” When we’re thinking about issues of surveillance and privacy, there’s so much fear a lot of the time. And I get that we need to think about these things, but flat-out rejection of a technology, or “I don’t like that,” isn’t really going to cut it, because it’s here and we’re moving forward. So how could we move forward in a more productive way, and how can we dig through this gray area instead of just saying, “It’s all black, I’m not going to deal with it”? And I guess for me, I start with little personal experiments, little tests. Then someday you might do something and realize, oh, that wasn’t a wall there; I could actually go into that space. And then it gets bigger. So maybe I could do that. How could I bring other people with me, or how could we do this as a community?
Crawford: Alright, well I’ll give you a really personal answer and then I’ll give you a slightly bigger picture answer.
The personal answer, the thing that gives me hope right now, is Deep Lab, a group of feminist artists and researchers who are trying to think about particular kinds of interventions we could make together, because we’re stronger with more of us than working alone. I’m very thrilled to say it has two members on stage here, Harlo and Allison. And that’s been really fantastic.
At a bigger‐picture level, what can we do? Yes, it’s really important that we develop stronger communities and stronger tools, and that we support the communities doing encryption‐tool development. But I don’t want us to feel like we have to retreat and hide. I also want us to make a lot of noise. I want us to think about the most public statements we can make collectively. What are the pressure points in public policy, around the communities that are really fighting these issues? And many of those are vulnerable and marginalized communities. How do we help them?
So for me these are the ways that I really want to figure out what are those pressure points? What’s the jiu‐jitsu? And how do we figure that out together?
Whittaker: Yes. Thank you.