Good afternoon, MozFest. I’m Simone, and I research and think about and sometimes write about surveillance, and also teach at the University of Texas at Austin. The way I approach surveillance can be encapsulated in this image right here.

This is a screen cap of a video from 2009. It is of Desi Cryer and Wanda Zamen, who call themselves Black Desi and White Wanda in the video. They’re two workers at a camping store in Texas who are testing out a new HP computer with an automated facial tracking system. And one of the things that happens in this video is that Black Desi says, “Watch what happens when my blackness enters the frame.” He’s talking about the camera’s seeming inability to follow him, to pan or to zoom and follow his face. But when White Wanda gets in there, the facial tracking system works. And so this question of what happens when blackness enters the frame can neatly encapsulate the ways I’ve been thinking and trying to talk about surveillance for the last few years.

And so for example there is this. In 1966, Marie Van Brittan Brown, a nurse living in Queens, New York, along with her husband Albert L. Brown, invented what they called a home security surveillance system. This was 1966 CCTV. As a nurse, she would often travel home late at night.

It was quite intricate. It consisted of a doorbell, a way for her to unlock the door from her bed, and an audio intercom, and you can see this as a precursor to modern video doorbells or other types of home surveillance systems.

And you can also kind of see from the diagram the robber there. Well, you can tell it’s a robber, I guess, because of the striped shirt, the sparsely populated beard, and the hat. So you could almost think of it as a kind of abolitionist technology: she was really concerned about the slow police response to their home in Queens whenever people would call in cases of emergency. And so this do-it-yourself take on surveillance is one of the ways that I think about how black women’s work has been absented from surveillance technologies—how we think about them, how we theorize them, and in this case how they are created. So one of the key things I do is ask how our past can allow us to think critically about our present.

And so this is an image of a law from New York City in the 1700s. These are the lantern laws, which required that black, mixed-race, or indigenous people, if they were to walk around the city after dark and weren’t in the company of some white person, would need to have with them a lit lantern as they moved about the city. If not, they could be taken up, arrested, and put in the gaol until some “owner” would come and get them. They could also be subject to beatings.

So this makes light a surveillance device, a supervisory device, but it also created certain humans as the lighting infrastructure of the city. And I took this to think about how, 200 years or so later, we have omnipresent policing practices where light is used. High-intensity floodlights are shone into people’s homes as a form of surveillance, as a form of protecting and lighting up certain spaces.

And so this image here is from somebody who took to Instagram to talk about that violence—thinking about the noise pollution, the sound pollution, that comes from a large generator. And just this summer, the American Medical Association put out a warning on the effects of high-intensity LED lights in the city, which might have effects in terms of changing humans’ cardiac rhythmicity, as well as intense glare and heart palpitations.

And so when I think about how the past allows us to ask critical questions about our present, I think about this lighting technology, this infrastructure, and the ways that 300 years earlier in New York City, black, indigenous, and mixed-race people were called upon or instructed to carry lanterns with them as they moved about after dark. And you can think about what type of human life is valued as, or devalued as, infrastructure.

So for example in Austin, Texas in 2012 at the South by Southwest Festival, one company took it upon themselves to create “homeless hotspots,” where people who were made homeless or underhoused were fitted with WiFi devices so that they could serve as mobile hotspots for people who needed to have their tech readily available.

You have another way in which light is used as a form of discipline. In certain cities, like Durham, North Carolina and also Tampa, Florida, there are ordinances in place that mandate that people who are panhandling or fliering wear phosphorescent or glow-in-the-dark vests if they are to do so.

And so another thing that I look at when I think about surveillance is the ways in which black women negotiate the TSA—the hair searches as they go through the airport. And I start with this: Solange Knowles, sister of Beyoncé, took to Twitter a few years ago to complain about the “discrim-fro-nation” she experienced when she was subjected to a hair search by the TSA. And so you see that this social media site becomes a site of critique of state practices. And many others continue to do so. They point to the ACLU, they tell each other to know their rights, and they also form a generalized critique of surveillance by way of Twitter and other sites.

But I’m going to return to Black Desi and White Wanda to talk about biometric technology. You can think of biometrics as doing a few things. They can be used for identification—so, who are you, the face in the crowd, or are you even enrolled in a particular biometric database. They can also be used for verification—so, answering the question “Are you who you say you are?” And they can also be used for automation, as in the case of Black Desi and White Wanda. So automation asks: is anyone there? And I looked at earlier uses of biometric technology to see the ways in which people critically engaged and challenged this marking of identity on the body.

One of the ways that I do that is to historicize biometrics through thinking about the branding of enslaved people. This is a carte de visite of Wilson Chinn from the 1860s. And you can see around his neck he has a metal collar; it’s called a longhorn. What you can’t see in this image is that he has “VBM” branded on his forehead. He liberated himself and escaped from slavery, but seemingly with this brand it was impossible to escape this marking on his forehead. And you can think about this marking as a traumatic head injury. This is not to say that the branding of enslaved people and biometric technology are one and the same, but it is to ask critical questions about how biometrics, if you think of it simply as body measurement, has been applied and used and resisted historically.

“Whites Only?”; YouTube user Teej Meister

And so we have this image, a screengrab from last year, of a sink that seemingly did not work for dark hands but worked for light hands.

Jacky Alciné, Twitter

Or we have this image here, also from last year, of another automation technology, where someone had uploaded images of his friend to a Google photo tagging or photo identification site, which continually categorized his friend, a black woman, as a gorilla. And so we have to continue to think about what kind of training data is used to populate these types of technologies.

I’m going to close with this right here. So, a couple of weeks ago the Georgetown University law school released a 150-page report on biometrics and facial recognition technology. One of the things that they found in this report is that in the US, over half of adult Americans have their biometric facial features in a database—one or more—that can be accessed by the police. And you can find this at www.perpetuallineup.org.

And so, when we think about the ways that the business end of surveillance meets the business end of policing, I want us to continue to be critical about biometric technology, about algorithms, and about who holds the proprietary data when people’s bodies, and parts and pieces of their bodies, or performances of their bodies—their biometrics—are held, whether by Facebook or by other types of sites and by policing.

So what I will do is close right here. But I will close with the question that I started with, to think critically about what happens when blackness enters the frame. Because I think it offers us some productive possibilities for our future. Thank you.


Sarah Marshall: So, anybody got any questions for Simone?

Audience 1: Seeta Peña Gangadharan, London School of Economics. So I really enjoyed your talk. Thank you so much. I think it’s really important that these kinds of conversations are had in a setting that is historically not inclusive of conversations about race and the intersection of race and technology. So thank you.

But one of the things that I’m wondering about—because as Sarah said when she introduced you, you’re sort of at the intersection of privacy, security, and digital inclusion. I was just wondering if you could expand upon the third aspect of that, digital inclusion, and maybe touch upon some of these positive or transformative things that you were alluding to at the end of your talk with your provocative question.

Simone Browne: Okay. So I think that the TSA example, of people talking back to the TSA, is transformative. Someone linking on Twitter to the Know Your Rights campaign is talking about using these technologies not only to think about what happens with the security theater in an airport. And so I think of myself as more of a gadfly. I think we each have a particular set of skills, and mine is not—you know, I’m not one of the people who are creating or developing these technologies. So I left off with that question about what happens when blackness enters the frame, and I think that black women using their Twitter to formulate a critique of the state is about inclusion—using the digital to have a more equitable kind of way in which we move through an airport space.

And so my suggestion is not to say, well, let’s teach young black girls how to code so that they can be more easily exploited into the digital labor force when they get older, but to continue to have these types of discussions about what’s at stake when 50% of American adults have their face in a database that is accessible to the police, and, when we think about the over- and hyper-policing of black people and people of color within the US and globally, what this means for the ways in which biometrics could be linked to criminalization practices.

Marshall: Good question. Anybody else got a question for Simone? Yes, I can see a gentleman at the back.

Audience 2: So, a couple of the examples you gave—the sink, the facial recognition algorithm, and I think I read that Microsoft Kinect has faced similar issues—are quite obviously not malice, most likely, but rather… Microsoft, for example, in the Kinect case said, “Well, we tested it on our employees, who were like 95% white.” So obviously it’s clear that when testing these things, things like QA departments will have to say, “We can’t do everything. Some people are handicapped. They might be hard to recognize if they’re facially deformed or something like that. But there are a bunch of races, and obviously there are a lot of people who are going to be using this. We should be testing with that.” Do you see a change in that? Do you see companies being more aware of this when they’re working on biometrics?

Browne: Yeah. And I think there’s a lot of opacity in how these companies create and research and develop these things, so I don’t necessarily have the knowledge to track whether there’s a change. But it seems to me that the way that the white body becomes read as the default setting, or produced as neutral, as a kind of prototypical whiteness, continues to happen. And so you would have someone use YouTube to say that a sink doesn’t work in 2015. Or in 2009 say that the camera doesn’t work. Or the Kinect, or the one with the gorilla in 2015.

So every year or so, it seems that the same type of white neutrality is used as the kind of prototypical body when developing these things. And so it falls to the consumers or the users to use a place like Instagram, or Twitter, or Facebook to offer a critique of who’s entering the conversations and the development of these technologies. And so perhaps it might not necessarily be the consumer’s job to show people how anti-black these technologies are.

Marshall: And to pick up on your point, then, part of the problem is diversity in technology, in the workforce, do you feel?

Browne: Yeah, I guess it could be diversity. But we could have a diverse group of people and still have an anti-black frame in how these things are developed. So I think it needs diversity, but also equity as well.

Marshall: Okay, any other questions for Simone? I have one more question I’m eager to ask. In your university—I mean, you’re here talking to a bunch of technologists, largely. And I’m guessing that with the subjects you teach back in Austin, you’re mainly not talking to technologists. Or do you talk to technologists about this, too?

Browne: Yeah. So, I have a variety of students in my class, from electrical engineering to women’s studies and also black studies. And so we can come together in a quite collaborative and interdisciplinary way to talk about these technologies, yeah.

Marshall: Great. Okay, if there are no further questions, then let’s thank Simone.

Browne: Thanks so much.

Marshall: Thank you.

Further Reference

MozFest 2016 web site

