Good afternoon, MozFest. I'm Simone, and I research and think about and sometimes write about surveillance, and also teach at the University of Texas at Austin. The way I approach surveillance can be encapsulated in this image right here.

This is a screen cap of a video from 2009. It is of Desi Cryer and Wanda Zamen. They call themselves Black Desi and White Wanda in the video. And they're two workers at a camping store in Texas who are testing out a new HP automated facial-tracking system on a computer. And one of the things that happens in this video is that Black Desi says, “Watch what happens when my blackness enters the frame.” And he's talking about the camera's seeming inability to follow him, to pan or to zoom and follow his face. But when White Wanda gets in there, the facial tracking system works. And so this question of what happens when blackness enters the frame can kind of neatly encapsulate the ways I've been thinking and trying to talk about surveillance for the last few years.

And so for example there is this. In 1966, Marie Van Brittan Brown, a nurse living in Queens, New York, along with her husband, Albert L. Brown, invented what they called a home security surveillance system. This was 1966 CCTV. As a nurse, she would often travel home late at night.

It was quite intricate. It consisted of a doorbell, a mechanism by which she could unlock the door from her bed, and an audio intercom, and you can see this as a precursor to modern video doorbells or other types of home surveillance systems.

And you can also kind of see from the diagram the robber there. Well, you can tell it's a robber, I guess, because of the striped shirt and the sparsely-populated beard and the hat. So you could almost think of it as a kind of abolitionist technology. She was really concerned about the slow police response in Queens, to their home, whenever people would call in cases of emergency. And so this do-it-yourself take on surveillance is one of the ways that I think about how black women's work has been absented from surveillance technologies—how we think about them, how we theorize them, and in this case how they are created. So one of the key things I do is ask how our past can allow us to think critically about our present.

And so here is an image of a law from New York City in the 1700s. These are lantern laws, which required that black, mixed-race, or indigenous people, if they were to walk around the city after dark and weren't in the company of some white person, would need to have with them a lit lantern as they moved about the city. If not, they could be taken up, arrested, and put in the gaols until some “owner” would come and get them. They could also be subject to beatings.

So this makes light a surveillance device, a supervisory device, but it also created certain humans as the lighting infrastructure of the city. And I take this to think about how, 200 years later or so, we have omnipresent policing practices where light is used. High-intensity floodlights are shone into people's homes as a form of surveillance, as a form of protecting and lighting up certain spaces.

And so the image you see here is from somebody who took to Instagram to talk about the violence—thinking about the noise pollution or the sound pollution that comes from a large generator. Just this summer, the American Medical Association put out a warning on the effects of high-intensity LED lights in the city, which might have effects in terms of changing humans' cardiac rhythmicity, intense glare, and also heart palpitations.

And so when I think about how the past allows us to ask critical questions about our present, I think about this lighting technology, this infrastructure, and the ways that 300 years earlier in New York City, black, indigenous, and mixed-race people were called upon or instructed to carry lanterns with them as they moved about after dark. And you can think about what type of human life is valued as, or devalued as, infrastructure.

So for example in Austin, Texas in 2012, at the South by Southwest Festival, one company took it upon themselves to create “homeless hotspots,” where people who were made homeless or underhoused were fitted with WiFi devices so that they could serve as hotspots for people who needed to have their tech readily available.

You have another way in which light is used as a form of discipline. In certain cities like Durham, North Carolina and also Tampa, Florida, there are ordinances in place mandating that people who are panhandling or fliering wear phosphorescent or glow-in-the-dark vests if they are to do so.

And so another thing that I look at when I think about surveillance is the ways in which black women negotiate the TSA—the hair searches as they go through the airport. And I start with this: Solange Knowles, sister of Beyoncé, took to Twitter a few years ago to complain about the “discrim-fro-nation” she experienced when she was subjected to a hair search by the TSA. And so you see that this social media site becomes a site of critique of state practices. And many others continue to do so. They point to the ACLU, they tell each other to know their rights, and they also form a generalized critique of surveillance by way of Twitter and other sites.

But I'm going to return to Black Desi and White Wanda to talk about biometric technology. You can think of biometrics as doing a few things. They can be used for identification—so, who are you, the face in the crowd, or are you enrolled in a particular biometric database? They can also be used for verification—answering the question “Are you who you say you are?” And they can also be used for automation, as in the case of Black Desi and White Wanda. So automation asks: is anyone there? And I looked at earlier uses of biometric technology to see the ways in which people critically engaged and challenged this marking of identity on the body.

One of the ways that I do that is to historicize biometrics through thinking about the branding of enslaved people. This is a carte de visite of Wilson Chinn from the 1860s. And you can see around his neck he has a metal collar. It's called a longhorn. What you can't see in this image is that he has “VBM” branded on his forehead. And so he liberated himself and escaped from slavery, but seemingly with this brand it was impossible to escape this marking on his forehead. And you can think about this marking as a traumatic head injury. This is not to say that the branding of enslaved people and biometric technology are one and the same, but it's to ask critical questions about how biometrics, if you think of it simply as body measurement, has been applied and used and resisted historically.

“Whites Only?”; YouTube user Teej Meister

And so we have this image, a screengrab from last year, of a sink that seemingly did not work for dark hands but worked for light hands.

Jacky Alciné, Twitter

Or we have this image here, also from last year, of another automation technology, where someone had uploaded images of his friend to a Google photo-tagging or photo identification service, which continually categorized his friend, a black woman, as a gorilla. And so we have to continue to think about what kind of training data is used to populate these types of technologies.

I'm going to close with this right here. So, a couple of weeks ago the Georgetown University law school released a 150-page report on biometrics and facial recognition technology. One of the things that they found in this report is that in the US, over half of adult Americans have their biometric facial features in one or more databases that can be accessed by the police. And you can find this at www.perpetuallineup.org.

And so, when we think about the ways that the business end of surveillance meets the business end of policing, I want us to continue to be critical about biometric technology, about algorithms, and about who holds the proprietary data when people's bodies, and parts and pieces of their bodies, or performances of their bodies—their biometrics—are held by Facebook or by other types of sites and by policing.

So I will close right here, with the question that I started with: to think critically about what happens when blackness enters the frame. Because I think it offers us some productive possibilities for our future. Thank you.


Discussion

Sarah Allen: So, anybody got any questions for Simone?

Audience 1: Seeta Peña Gangadharan, London School of Economics. So I really enjoyed your talk. Thank you so much. I think it's really important that these kinds of conversations are had in a setting that is historically not inclusive of conversations about race and the intersection of race and technology. So thank you.

But one of the things that I'm wondering about—because, as Sarah said when she introduced you, you're sort of at the intersection of privacy, security, and digital inclusion. I was just wondering if you could expand upon the third aspect of that, digital inclusion, and maybe touch upon some of these positive or transformative things that you were alluding to at the end of your talk with your provocative question.

Simone Browne: Okay. So I think that the TSA example, of people talking to the TSA, is transformative. Someone linking on Twitter to the Know Your Rights campaign is not only using these technologies but thinking about what happens with the security theater in an airport. And so I think of myself as more of a gadfly. I think we each have a particular set of skills, and mine is not—you know, I'm not one of the people that are creating or developing these technologies. So I left off with that question about what happens when blackness enters the frame, and I think that black women using their Twitter to formulate a critique of the state is about inclusion—using the digital to have a more equitable kind of way in which we move through an airport space.

And so my suggestion is not to say, well, let's teach young black girls how to code so that they can be more easily exploited when they get older into the digital labor force. But to continue to have these types of discussions about what's at stake when 50% of American adults have their face in a database that is accessible to the police, and, when we think about the over- and hyper-policing of black people and people of color within the US and globally, what this means for the ways in which biometrics could be linked to criminalization practices.

Allen: Good question. Anybody else got a question for Simone? Yes, I can see a gentleman at the back.

Audience 2: So, a couple of the examples you gave—the sink, the facial recognition algorithm, and I think I read that Microsoft Kinect has faced similar issues—are quite obviously not malice, most likely, but rather… Microsoft, for example, in the Kinect case said, "Well, we tested it on our employees, who were like 95% white." So obviously when testing these things, QA departments will have to say, "We can't do everything. Some people are handicapped; they might be hard to recognize if they're facially deformed or something like that. But there are a bunch of races, and obviously there are a lot of people that are going to be using this. We should be testing with that." Do you see a change in that? Do you see companies being more aware of this when they're working on biometrics?

Browne: Yeah. And I think there's a lot of opacity in how these companies create and research and develop these things, so I don't necessarily have the knowledge to track whether there's a change. But it seems to me that the way the white body becomes read as the default setting, or produced as neutral, as a kind of prototypical whiteness, continues to happen. And so you would have someone use YouTube to say that a sink doesn't work in 2015. Or in 2009 say that the camera doesn't work. Or Kinect, or the one with the gorilla in 2015.

So every year or so, it seems that the same type of white neutrality is used as the kind of prototypical body when developing these things. And so it falls to the consumer or the users to use a place like Instagram, or Twitter, or Facebook to offer a critique of who's entering the conversations and the development of these technologies. And so perhaps it might not necessarily be the consumer's job to show people how anti-black these technologies are.

Allen: And to pick up on your point, then, part of the problem is diversity in technology, in the workforce, do you feel?

Browne: Yeah, I guess it could be diversity. But we could have a diverse group of people and still have an anti-black frame in how these things are developed. So I think it needs diversity but also equity as well.

Allen: Okay, any other questions for Simone? I have one more question I'm eager to ask. At your university—I mean, you're here talking to a bunch of technologists, largely. And I'm guessing that with the subjects you teach back in Austin, you're mainly not talking to technologists. Or do you talk to technologists about this, too?

Browne: Yeah. So, I have a variety of students in my class, from electrical engineering to women's studies and also black studies. And so we can come together in a quite collaborative and interdisciplinary way to talk about these technologies, yeah.

Allen: Great. Okay, if there are no further questions then let's thank Simone.

Browne: Thanks so much.

Allen: Thank you.

Further Reference

MozFest 2016 web site

