Roderic N. Crooks: I'm introducing to you Dr. Safiya Noble, Assistant Professor at the University of Southern California's Annenberg School of Communication. Previously, she was a professor at the Department of Information Studies at UCLA, where she held appointments also in the departments of African American Studies and Gender Studies. She is cofounder of the Information Ethics & Equity Institute, an accrediting body that uses research to train data and information workers in issues related to social justice and workforce diversity.

She is also partner at Stratelligence, a firm that works with organizations to develop strategy based on informatics research in areas that include justice and ethics, labor and management, health and well-being. And as if that weren't enough, in addition to her current book project Dr. Noble is coeditor of The Intersectional Internet: Race, Sex, Class and Culture Online, and Emotions, Technology and Design: Communication of Feelings Through, With and For Technology. She's written numerous articles about race, gender, and the information professions, among other topics.

So I'm especially excited that Dr. Noble could join us this quarter. She provides a model for scholarship that is technologically sophisticated, politically engaged, and generative in its approach to disciplinarity. Please join me in welcoming her.

Safiya Noble: Thank you, Roderic, for that lovely introduction. You all really scored when you got him to come and be part of your team here. We miss him in LA, and he was a favorite of ours at UCLA, so I'm really happy for you to have him here. And I'm so pleased to join you this afternoon, and to be competing with the beautiful sunny, hot winter day, which still is amazing to me after having spent thirteen years at the University of Illinois, to come home and really enjoy my life thoroughly on days like these.

So thanks for having me. I thought today I would talk a bit about the forthcoming book, Algorithms of Oppression, and then also maybe leave a little bit of space for us to dialogue. So I'm going to move fairly quickly through some of this, just so that we have enough time to also stay engaged in conversation. I'm even going to set a timer for myself to warn me when we're just about out of time. Alright.

Some of you might be familiar with this. This is a campaign from October 21st of 2013, when the United Nations teamed up with an ad agency, Memac Ogilvy & Mather Dubai, to create this campaign using what they called "genuine Google searches." The campaign was designed to bring critical attention to the sexist ways in which women were regarded and still denied human rights. Over the mouths of various women of color, they placed the auto-suggestions that appeared when these searches were run.

So for example, when they started to search "women cannot…," Google auto-populated "drive, be bishops, be trusted, speak in church." "Women shouldn't…": have rights, vote, work, box. "Women should…": stay at home, be slaves, be in the kitchen, not speak in church. "Women need to…": be put in their places, know their place, be controlled, be disciplined.

Now, what was interesting to me about this campaign when it first appeared is that it was presented in a way that matches how many people think about Google search, and particularly auto-suggestion: that it is strictly a matter of what is most popular, and that the kinds of things we find in search are strictly a matter of what's most popular.
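That popular mental model, autocomplete as a pure popularity contest, can be sketched in a few lines. The query log and its counts below are invented for illustration; real autosuggest systems layer many more signals and filters on top of raw frequency, which is exactly what the talk goes on to complicate.

```python
from collections import Counter

# Hypothetical query log; the queries and counts are invented.
query_log = Counter({
    "women cannot drive": 300,
    "women can vote": 120,
    "women can code": 90,
})

def autocomplete(prefix, log, k=3):
    """Return up to k logged queries starting with prefix, most frequent first."""
    matches = [(query, count) for query, count in log.items()
               if query.startswith(prefix)]
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [query for query, _ in matches[:k]]

# Pure popularity ranking surfaces the most-typed query first,
# regardless of what it says.
print(autocomplete("women can", query_log))
```

Under this naive model, whatever is typed most often wins the suggestion box, with no editorial judgment anywhere in the loop.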

"The ads are shocking because they show just how far we still have to go to achieve gender equality. They are a wake-up call, and we hope that the message will travel far," noted Kareem Shuhaibar, a copywriter for the campaign who was quoted on the United Nations website.

Now, I found this campaign interesting because I thought that maybe we could spend some time looking at campaigns like this, and a whole host of failures of Google search, to talk about what other processes might in fact be involved with the kind of information that we find there. And quite frankly, what I think is at stake, mostly for communities who are already marginalized or disenfranchised, and how this might in fact exacerbate that.

So I want to just give— You know, I don't really want to give trigger warnings in my classes. But I'll give the trigger warning that if you love Google, you're going to be super mad at me later. So that's your warning.

Alright, here's a story you might be fairly familiar with. Here's The Washington Post. This was about a year and a half ago. DeRay Mckesson, who's a fairly well-known activist on Twitter—some of you might be familiar with him. He became fairly well-known around Ferguson in particular, advocating for Mike Brown online. He really became super popular after Beyoncé followed him and he spiked in followers. So I just feel compelled to say that if anyone here knows Beyoncé and you can get her to follow me, it would really amplify the work. Just throwing that out there.

So Deray tweets,

And what was happening at the time was that if you did a search on the "'n-word' house" or the "'n-word' king," Google Maps would take you to the White House. And this was during the presidency, obviously, of Barack Obama, who some of us still wish was president.

So The Washington Post, US News, everyone's contacting Google, trying to get a quote from them on how this could happen. And this is a very typical kind of Silicon Valley response—not just from Google but from many Silicon Valley companies—which is, "Some inappropriate results are surfacing in Google Maps that should not be, and we apologize for any offense this may have caused. Our teams are working to fix this issue quickly."

There's kind of two things going on here. First, "we apologize for any offense this may have caused." I know for me that when my husband says something like, you know, "I apologize if you're offended," I actually don't feel apologized to. So it's this weird, interesting non-apology apology, which is something that we often see from big corporations in general: if there's like one random person in the world who might've been offended by this, then we apologize. As if the whole notion of taking responsibility for the results that their algorithm produced would in fact be offensive, and we could just really powerfully claim that.

But more importantly, I think, in their statement, "our teams are working to fix this issue quickly" is really a pointed way in which many tech companies presume that their platforms are working perfectly, and that this is a momentary glitch in the system, right. And so this is another kind of discourse we often see coming out of tech companies.

The work that I do is really situated in a critical information studies lens. So I'm going to talk about that, and I'll share with you some of the people who've really influenced the work that I'm doing. But one of the things that I think is really important is that we're paying attention to how we might be able to recuperate and recover from these kinds of practices. So rather than thinking of this as just a temporary glitch, in fact I'm going to show you several of these glitches, and maybe we might see a pattern.

Here we have Kabir Ali. Some of you might remember this. About a year ago he took screenshots of a video. He's a young teenager who has his friend video him as he does a search for "three black teenagers." And when he does a search on three black teenagers, as you can see, almost every single shot is some type of criminalized, mug shot-type image of black teenagers. And then he says to his friend, let's see what happens when we change one word. He changes the word "black" to "white," and then we get some type of strange Getty photography, stock images of white teenagers playing multiple sports, apparently at one time, that kind of don't go together. And this is the idealized way in which white teenagers are portrayed.

And so this story also goes viral quickly. Jessica Guynn, in fact, who is the tech editor for USA Today (if you want to follow good critical reporting of the tech sector, I would say take a look at what the tech editors at USA Today are doing), covers the story, as do many others. And instead of issuing an apology after that incident, Google just quietly tweaks the algorithm. And the next day @bobsburgersguy on Twitter notices that the algorithm has changed.

Now, what's interesting are again the choices that are made. Google adds in a young white man in court who's actually being arraigned on hate crime charges, along with kind of keeping— And so this idea that to correct the reality, or a particular truth, we're going to also add white criminal pictures, again legitimating that the black criminality was actually legitimate. And then, you know, we throw in a couple of black girls playing volleyball, and apparently that's the fix. Okay. So this is again the quiet response.

Bonnie Kamona, Twitter, 4/5/2016 [presentation slide]

Here's another failure in Google searches; this is a story that went viral. When you did image searches on "unprofessional hairstyles for work," you were given exclusively black women with natural hair. So I wore my unprofessional hairstyle for work—because that's actually the hairstyle I wear every day—for you. And you know, when you change this to "professional hairstyles for work," you are given exclusively white women with ponytails and updos. And I often try to explain to my white colleagues that it's the ponytail and the updo that really make you professional. And also being white.

The algorithmic assessment of information, then, represents a particular knowledge logic, one built on specific presumptions about what knowledge is and how one should identify its most relevant components. That we are now turning to algorithms to identify what we need to know is as momentous as having relied on credentialed experts, the scientific method, common sense, or the word of God.
Gillespie, Tarleton. "The Relevance of Algorithms," in Media Technologies, ed. Tarleton Gillespie, Pablo Boczkowski, and Kirsten Foot. Cambridge, MA: MIT Press. [presentation slide]

So what does this mean? I mean, one of the things that is really helpful is looking to people like Tarleton Gillespie. I think he says it perfectly in one of his pieces about the relevance and importance of algorithms: "That we are now turning to algorithms to identify what we need to know is as momentous as having relied on credentialed experts, the scientific method, common sense, or the word of God."

And I'm going to suggest something that may be common sense to some of you, and might seem a little provocative for others. Which is: we have this idea that algorithms, or computerization, automated decision-making, can somehow do better than human beings can do in our decision-making, in our assessment. This is part of the discourse around algorithmic decision-making or automated decision-making.

And these practices to me are quite interesting because they rise historically at a particular moment. In the 1960s we start to see a rise of deep investment in automated decision-making and computerization, the kind of phenomenon that moves to the fore into the 1980s with desktop computing. And this coincides historically with the very moment that women and people of color are legally allowed to participate in management and decision-making processes.

So I find this interesting, that at the moment that we have more democratization of decision-making in the highest echelons of government, education, and industry, we also have the rise of a belief, or an ideology, that computers can make better decisions than human beings. And I find this to be an interesting kind of tension, and these are the kinds of things that I look at and explore in my work.

So let me just say, you know, that every academic has to have a slide that has too many words on it. And that's this slide. So just bear with me for a minute and I'm going to do better. But I think it's important, especially for the grad students and undergraduates in the audience, if you're interested in thinking in an interdisciplinary way about digital technologies. When I was a grad student and I started thinking about black feminism and critical race theory alongside the information science and technology work that I was being exposed to, I was often met with derision around that. In fact, I was just sharing with some of the grad students earlier today at lunch that I can remember being in a research lab meeting and one of my professors saying something to me like, "What is black feminism? Who's ever even heard of that?" And I was like, pretty much everybody in gender studies and black studies. But I guess…they don't count?

But what I found at the time, when I was thinking about racist and sexist bias in algorithms and technology platforms, particularly Google, was that that was a hard idea to get through. This is around 2010 or so, when I first started thinking about and researching this. Now, you know, everyone's talking about biased algorithms. But there was a time when, you know, my mother-in-law wasn't talking about it. Do you know what I'm saying? Like, it's in the headlines now all the time.

So I just encourage you to think about, for those of you who are students, what frameworks you're pulling in from other places, outside of the traditional fields of library and information science, information studies, or computer science, that can help you ask different questions. And that's what I was interested in doing. At the time, the people who were writing about Google, around 2010, '11, '12, were really thinking about the economic bias, for example, of Google, and how Google prioritizes its own interests or its investments over others'. And we see this in the early work of people like Helen Nissenbaum, who was writing back in 2000 about how search technologies hold bias, or Lucas Introna.

But I was interested in the disproportionate harm that might come to racialized and gendered people through these kinds of biases. And again, I think that when we take our own— You know, I was a sociology major as an undergrad, and so I was really influenced by people like C. Wright Mills in thinking about the private troubles that I had, and whether those might also be public concerns. I was personally troubled by the fact that I was seeing people who look like me being characterized in a particular way, and people that I cared about, who were part of the communities I was a part of, being characterized a particular way. And then I started to realize that this was more of a public phenomenon, not just a private trouble.

So here I'm thinking about things like the social construction of technology. Now, if you're in gender studies, or you're in ethnic studies, black studies, Latinx studies, American Indian studies, we talk about social constructions of race all the time. Or social constructions of gender. We're comfortable with those kinds of framings, about race and gender not being biological, not being natural, not being fixed, but in fact being a matter of power relations, historically situated; fluid, dynamic things that can change over time in their meaning.

And so I was very drawn to the social construction of technology theorists. People like Langdon Winner and Arnold Pacey, who were really helping us understand that the technologies that we're engaging with are in fact not flat, they're not neutral, but they're laden with power, and they are constructions of human beings. And so then, what might human beings be putting into the digital technologies that we're thinking about?

The other dimension of my work, in that it's black feminist and engaging with critical race theory, is that I was interested in, and I continue to be interested in, things that are actionable, things that matter to me in the world, and that are legible, also. So for example, when I say, you know, "My mother-in-law doesn't know anything about algorithms"—but I can talk to her. She knows that messed-up things happen to black people. And she can understand that my work is trying to engage there. You know what I'm saying? So these are the kinds of things where, when we think about research that can make a difference in the world, I think that's important. I'm also, obviously, trying to bolster this field of critical information studies and add a voice to that.

Alright, so why Google? Quickly: because Google is a monopoly, and they control the search market, along with other kinds of markets. Pew did a series of studies on search engine use. They did one in 2005; the last one they did was in 2012. They do these kinds of tracking studies on people who use search engines and what they think, and I'm just going to share with you a couple of headlines from their last study. Back in 2012, 83% of people who use search engines used Google. More than that if we look at mobile.

So people often ask me—and I'm just cutting this off now, before we get to the Q&A—don't ask me why I don't look at Bing. Because nobody's using Bing, okay? That's why. That's the easy answer. Study the monopoly leaders, because everyone's trying to replicate what they're doing. And so this is why it's important to study that.

Here's what's especially alarming out of the Pew study. They say that users of search engines report generally good outcomes and relatively high confidence in the capabilities of search engines. Seventy-three percent of search engine users say that most or all of the information they find as they use search engines is accurate and trustworthy. Seventy-three percent say accurate and trustworthy. Sixty-six percent of search engine users say search engines are a fair and unbiased source of information.

Alright, so this is interesting to me. Now, I've been giving this talk and talking about the book for a while. And you know, it's interesting to me to give this kind of talk post- the last presidential election. Because now, again, people have a much higher sense of, "Hey, wait a minute. Maybe platforms are doing something that we hadn't thought about before in terms of misinformation or circulating disinformation." I think of it another way when I'm in a more cynical, pessimistic mood—I might say, you know, nobody cared about these kinds of biases when they were biased against women and girls of color, but now everybody cares because it's thrown a presidential election. So you know, you can think about it whatever way you want to.

Leading search engines give prominence to popular, wealthy, and powerful sites—via the technical mechanisms of crawling, indexing, and ranking algorithms, as well as through human-mediated trading of prominence for a fee at the expense of others.
Nissenbaum, Helen, & Introna, Lucas. "Shaping the Web: Why the Politics of Search Engines Matters" [presentation slide]

Helen Nissenbaum, I think, tried to forewarn us back in 2000, you know. She said the leading search engines really give prominence to popular, wealthy, and powerful sites. Really, her finding was that those with more capital are able to influence what happens in the realm of commercial search. And of course this comes at the expense of others who are less well-capitalized. And so I think this is part of what I'm trying to think about in my work.
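The link-driven prominence Nissenbaum and Introna describe can be illustrated with a minimal PageRank-style calculation. The link graph and site names below are invented for illustration; production ranking systems combine many signals beyond link structure, but the core dynamic of links begetting rank is visible even in this toy.

```python
# Minimal PageRank sketch over a hypothetical link graph.
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank for a dict {page: [outbound links]}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outbound in links.items():
            if not outbound:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outbound:
                    new_rank[target] += damping * rank[page] / len(outbound)
        rank = new_rank
    return rank

# A big commercial site receives links from everywhere; a small blog does not.
graph = {
    "big-magazine.example": ["small-blog.example"],
    "site-a.example": ["big-magazine.example"],
    "site-b.example": ["big-magazine.example"],
    "site-c.example": ["big-magazine.example"],
    "small-blog.example": ["big-magazine.example"],
}
ranks = pagerank(graph)
# The heavily linked site ends up ranked well above the small blog.
print(sorted(ranks.items(), key=lambda kv: kv[1], reverse=True))
```

Because inbound links (and the capital to attract them) compound, the well-linked site dominates the ranking regardless of the quality or representativeness of its content.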

Now, here's the part of the talk where you're going to feel like you need to hit the wine bar afterwards, okay. So just bear with me and we'll get through this part, but I think this is really at the epicenter of the reasons why I care about this topic.

When I first started collecting search results in 2009 and 2010, I was looking at a variety of intersectional identities. So I was looking at black girls, Latina girls, Asian girls, Asian American, American Indian—I was looking at all kinds of different combinations. I looked at boys, I looked at girls, I looked at men, I looked at women. Really not trying to essentialize the identities, but to think in terms of the common ways in which people engage with identities, whether their own or others'. And I used mostly the categories that are in the census, because those are oftentimes the ethnic and racial terms that people get socialized around.

And when I first started this work in 2009, the first hit when you did a keyword search on "black girls" was hotblackpussy.com. I'm going to say "pussy" a bunch of times right now and then we're gonna move forward.

So, this was concerning to me. These were Google's default settings, right. There was nothing particularly special about our setup. I was engaged with other graduate students, and we were collecting searches with different IP addresses, off the campus network, with machines whose only digital traces were that they came out of the box and we had to connect to a network. Really, machines that didn't have all of our personal digital traces. Although I will say that I collected searches on my own laptop too, and I've been writing about and critiquing Google for a minute. And I still get messed-up results. So this idea that my own traces would somehow influence the results— I mean, Google's never quite figured out that I don't want to see that, no matter how many times I write about it or speak about it.

You can see here, by 2011 sugaryblackpussy.com has taken over the top spot, followed by Black Girls, which is a UK band. Has anyone heard of the Black Girls? I mean, it's like—one, okay. It's a UK band of white guys. They call themselves the Black Girls. They're incredible at search engine optimization and terrible at music distribution. Just throwing that out there, because no one's ever heard of them except for [inaudible name].

Okay, so they're #2, followed by "Two black girls love cock"—this is a porn site; followed by another porn site; another kind of gateway chat site to a porn site; followed by the Black Girls, our UK band again, their Facebook page winning in the SEO game; followed by a porn site, and then by a blog.

So I first wrote about this in 2012. I was really interested in this phenomenon, and I contacted Bitch magazine, which some of you might be familiar with—they're a feminist magazine. It's really popular in the Bay Area, where I used to live. And they critique popular culture. They had a special issue out on cyberculture. And you know, I didn't have the heart to say nobody's saying "cyberculture" anymore, because it was already out with CFPs. So I just wrote to them and I said, "You should let me write this story about what's happening in search, because this is really important."

And they wrote back and they said this is not a story, because everybody knows when you search for girls online you're gonna get porn.

And I was like, aren't we the feminist magazine? Don't we want to critique it? Talk about it?

And they were like, "No. This is a non-story."

And I was like, you know, it's more complicated than just what happens specifically to black girls, or Latina girls, or Asian girls. It's also that girls, that women…all these sites are about women, but women are coded as girls. This is just fundamental Sexism 101. We could talk about that, too.

It's not a story. They're not interested.

So finally I wrote back—because I'm relentless—and I said… And this is just a tip for grad students: do that. Like, stay with it. I wrote to them and I said, I'd like you to do a search on women's magazines, and just let me know if Bitch magazine shows up in the first five pages.

And then like ten minutes later— I was just visualizing that somebody got the email, and they printed it out and they walked around the office. And they were like, "Can you believe—? Oh my god, maybe there is something here."

And then they wrote back and they were like, okay, you can do the story.

So I wrote the story, and one of the things I talked about was what it means that the traditional players—Good Housekeeping, Vogue, Elle…you know, the big, well-capitalized mass media magazines—were able to dominate and control the word "women." And really, unless you looked for "feminist media," Bitch magazine was not going to be available to you.

And of course, what does this mean? The readership of Bitch magazine runs from older high school-aged women into young adults. So this is a prime group of people for whom maybe a concept like feminism would be valuable or interesting, but it would be unavailable or inaccessible given the ways that keywords were associated with particular types of media. So this was one of the first places where I wrote about this, and then I wrote some academic things, and then I wrote a book about it.

Here we go with Latina and Asian girls. Again, these are all hypersexualized: Asian girls porn and sex pictures and movies as our first result. Going over to Latina girls, we get a website that's actually match.com, but you can see it doesn't have a yellow box around it; it isn't really called out as an ad per se. Followed by Hot Latina Girls on Facebook, and a whole series of sexualized representations of Asian and Latina girls.

Now, these ideas about the hypersexualization of black women and girls, women of color—this is not new. This is not a new media phenomenon. These are in fact old media practices. There's a phenomenal resource for you, particularly if you're teaching students who want to talk about the history of racist and sexist representation in the media and in popular culture. The Jim Crow Museum of Racist Memorabilia is really an amazing online digital collection. It's also a great collection when you're teaching about digital collections and you're thinking about digital libraries.

The Jim Crow Museum really started as a collection of what we would call racist memorabilia by a professor at Ferris State University, and it was all digitized. And what's really interesting about this collection, which gives us easily a 200-year history of these kinds of sexualized images, is that it also gives us a counter-narrative about what they mean.

So, in the dominant narrative, black women appear as Jezebels, as Sapphires, in these hypersexualized stereotypes of black women. These are inventions that are used particularly when the enslavement of African people becomes outlawed in the United States and there has to be a mass justification for the reproduction of the slave labor force. And so part of how that justification comes into existence is by characterizing black women as hypersexual, as people who want to have a lot of babies, right, that can be born into enslavement.

And so this is something that's really important. These stereotypes don't just come out of thin air, and they're not based in some type of nature or natural proclivity of black women. They're actually racist, capitalist, sexist stereotypes that are used as part of the economic subjugation of black people and black women. And you can read there quite a bit about longer histories of European fascinations with black sexuality, and otherwise.

But this is a highly, deeply commodified stereotype. And of course this is one of the reasons why it's still present and prevalent in the kinds of information and mediascape that we're engaging with.

Here's an image search. This goes back as far as 2014. Now, I have to say, you know, 2014—one of the things that's difficult about doing research, as all of you know who study the Internet, is that the moment you study a particular phenomenon that happens on the Internet, it changes. So I'd like to think in my own work that I'm capturing these artifacts and then trying to talk about what these moments or artifacts represent, so that we can make sense of them. Because it's likely that we'll see them again.

Here's an image search on "black girls." Again, it's consistent with the textual discourses in commercial search about black women. I really thought in 2014, honestly, that Sasha and Malia Obama would show up near the top. You know, they were super popular. In 2009, we thought maybe Raven-Symoné, the last vestiges of her. We know why she's out now. But you know, by 2009 we kind of thought maybe Raven would still be there, but she was gone, too.

And the linkages— Of course, these images are connected to sites that had a lot of hyperlinks, a lot of capital; they are connected in a network of capitalized and well-trafficked kinds of images.

One of the things that's interesting to me as a research question is how black women and girls would intervene upon this within the framework of property rights, ownership, and capital…you know, where those who have the most money to spend, for example in AdWords, who are willing to pay more, have their kinds of content and images rise to the top. How would black girls ever compete economically—or numerically, quite frankly—against that, or in that kind of commercial ecosystem? And again, who gets to control the narrative and the image of black women and girls has always been very important to me.
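The pay-to-rank dynamic can be sketched as a toy keyword ad auction in the spirit of AdWords, where slots are ordered by bid times quality score. The advertisers, bids, and scores below are invented for illustration, and real ad auctions involve far more machinery (second-price pricing, budgets, targeting); the point is only how capital translates into visibility.

```python
# Toy keyword ad auction: slots are ordered by bid * quality score.
# All names and numbers are hypothetical.
def rank_ads(bids):
    """bids maps advertiser -> (bid in dollars, quality score in [0, 1])."""
    return sorted(bids, key=lambda ad: bids[ad][0] * bids[ad][1], reverse=True)

bids = {
    "big-media.example": (4.00, 0.8),       # well-capitalized incumbent
    "community-site.example": (0.50, 0.9),  # small community publisher
}

# The deep-pocketed bidder takes the top slot despite a lower quality score.
print(rank_ads(bids))
```

Even with a better quality score, the smaller bidder cannot buy its way to the top slot, which is the asymmetry at issue in the question above.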

Audience Member: How would those images change if you searched for "white girls"?

Noble: White girls are also sexualized. I think you can see, when you look over time, the hierarchy of the racial order in the United States, in that more explicit types of images are always more apparent for women of color. And this is very consistent with the porn industry. For example, if you study pornography—which, you know…proceed with caution. It's a depressing topic, quite frankly.

But if you read the porn studies literature, for example, you find that in the porn industry white women do what we would consider softer, less dangerous types of pornography, both emotionally and physically, and the representations of white women in pornography are not nearly as explicit as they then become for black women. Black women do the most dangerous types of pornography labor in that industry. And I think you see a mapping of that.

Media representations of people of color, particularly African Americans, have been implicated in historical and contemporary racial projects. Such projects use stereotypic images to influence the redistribution of resources in ways that benefit dominant groups at the expense of others.
Davis, Jessica L., & Gandy, Oscar H., “Racial Identity and Media Orientation: Exploring the Nature of Constraint”

One of the things that’s also interesting when you search for white girls is that white girls don’t characterize themselves as white girls. They’re just girls. And so this is also another phenomenon, that the marking of whiteness is actually a phenomenon that happens by people of color to name whiteness as a project and as a distinguishing feature that people many times who are white do not embody or take on themselves.

So, what else is important about Google and search? I think this is an important study. This was a critique that was written about Epstein and Robertson, who did a great— You know, when you show up in Forbes and people are hating on your research, you’re probably doing a good job, okay.

So, Epstein and Robertson did this amazing study in 2013 where they argued that democracy was at risk because search engine results could be manipulated without people knowing it. They had a controlled study, and what they found is that if they gave voters…if they had them do a search on a political candidate and voters saw positive stories about that candidate, they signaled they would vote for that person. And in that same controlled study, if they showed negative stories about a candidate, people said they would not vote for that person.

So, they argued from their study back in 2013 that democracy was at risk because search engine rankings, particularly getting to the first page, were an incredible problem. Because of course we know from other people’s research, like Matthew Hindman, who wrote an important book called The Myth of Digital Democracy, that people who have large campaign financing chests to draw from are able to make it into the public awareness, and they’re certainly much more likely to make it to the first page of Google search and be able to control the narrative and the story about their candidates because they have the capital to do so. And they argued in their study that unregulated search engines could pose a serious threat to the democratic system of government, and they certainly have been important players in talking about regulation of search engines.

Here we have Google’s top news link for final election results. This is the [inaudible] following the presidential election. The first link was a story that led to information here that Trump had in fact won the popular vote. So we know that that is an “alternative fact.” That is not a real fact. That did not happen. President Trump did not win the popular vote. And yet this is the first hit, right, immediately following.

And so, it’s been interesting to me to watch the conversations over the last few months about biased information (some people are calling that fake news) and an incredible emphasis on Facebook. But not necessarily as much of an emphasis on Google. And let’s not forget about the incredible import that Google has. And one of the reasons why I’m so interested in them is because they’ve really come to be, in the public imagination, seen as a legitimate, credible—you remember back to the Pew study—fair, accurate site where people expected that the information they find they can trust. And so this again is something that I think we have to be incredibly cautious about.

I’m going to quickly skim over this because I want to get to a more important topic here before I close out, which is to say that Sergey Brin, one of the cofounders of Google, has been asked many times about the manipulation of search results. And here we had a story about white supremacists and Nazis, how they’ve hijacked particular terms. You might be familiar with this, of course. For many years they’ve been able to manipulate and game Google around the words “jews” and “jew,” and how those could be tightly linked to Holocaust denial and white supremacist web sites.

When Sergey Brin is asked about adjusting the algorithm to kind of prevent the cooptation of different kinds of words by white supremacists, I love his—it’s like I almost cannot keep from laughing when I read this. He says that would be bad technology practice. “An important part of our values as a company is that we don’t edit the search results,” Brin said. “What our algorithms produce, whether we like it or not, are the search results. I think people want to know we have unbiased search results.”

Except of course when we’re in France and Germany, where it’s against the law to traffic in antisemitism, and then we fully suppress and curate white supremacists’ antisemitic content out.

So we have kind of these different narratives: one that happens for the US press, and then a different narrative that’s happening in France and Germany, where quite frankly many platforms—not just Google; Facebook, Tumblr (which is Yahoo!-owned), many of these platforms—are struggling with trying to manage the flow of disinformation. You have public officials, particularly in Europe, who are calling for an immediate stop to the kinds of disinformation and misinformation that are flowing through these platforms, with a recognition that an incredible amount of harm can be generated from them.

Of course in the EU, particularly in Germany, there’s such a heightened awareness about the relationship between hate speech and what led to the Holocaust. And so we have different conceptions about freedom of speech than exist in other parts of the world. And I think that maybe we could learn from other places about that, but that’s again another topic for another day.

So the last piece I want to give you, and then we’ll open up for some conversation. Here we have the case of Dylann “Storm” Roof. Many of you know Dylann Roof was a 21-year-old white nationalist who opened fire on unsuspecting African-American Christian worshipers at Emanuel African Methodist Episcopal Church in the summer of 2015. I won’t go a lot into the backstory, but this is a site that’s not chosen in vain. This has been kind of a site of radical resistance to white supremacy, and of struggle. A site for the organizing and struggle for civil rights and human rights and recognition of African-American people, black people, in the United States.

And so Dylann Roof, after the murders—immediately, many researchers are turning to the Web and trying to make sense of what’s happening here. And I wrote a whole chapter in the book about this phenomenon. Within about twenty-four hours, someone on Twitter found Dylann Roof’s kind of online diary at “The Last Rhodesian,” and this was the part of his diary that jumped out at me. He says,

The event that truly awakened me was the Trayvon Martin case. I kept hearing and seeing his name, and eventually I decided to look him up. I read the Wikipedia article and right away I was unable to understand what the big deal was. It was obvious that Zimmerman was in the right. But more importantly this prompted me to type in the words “black on White crime” into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black on White murders. I was in disbelief. At this moment I realized that something was very wrong. How could the news be blowing up the Trayvon Martin case while hundreds of these black on White murders got ignored?

From this point I researched deeper and found out what was happening in Europe. I saw that the same things were happening in England and France, and in all the other Western European countries. Again I found myself in disbelief. As an American we are taught to accept living in the melting pot, and black and other minorities have just as much right to be here as we do, since we are all immigrants. But Europe is the homeland of White people, and in many ways the situation is even worse there. From here I found out about the Jewish problem and other issues facing our race, and I can say today that I am completely racially aware.
Dylann Roof Manifesto, 2015 at www.lastrhodesian.com [presentation slide; emphasis from slide]

Now, one of the things that’s interesting is that when we try to replicate the search of “black on white crime,” the Council of Conservative Citizens kind of came up again and again. The Council of Conservative Citizens, according to the Southern Poverty Law Center, is characterized as vehemently racist, alright. You could think of it as the online equivalent to the White Citizens’ Council, which was…when you were as racist as say the KKK, but you were say a mayor, or a judge, or an assembly member, you couldn’t really be in the KKK but you could be in the White Citizens’ Council.

So the Council of Conservative Citizens is like a… You know, you were just in an interest group that cared about conservative values and the interests of white communities. But you weren’t like a night rider, a terrorist, riding out and lynching people, let’s say. Those were kind of the distinctions between the White Citizens’ Council and the KKK. The Council of Conservative Citizens, if you look at their site, is really what Jessie Daniels calls a cloaked web site in her great book Cyber Racism.

So, what you don’t get when you do a search on “black on white crime,” for example—you don’t get any information that tells you that this is a white supremacist red herring. That this is a phrase that’s used by white supremacists as an organizing kind of moniker. You also don’t get FBI statistics that show you that the majority of murders happen within community. So while we’re very familiar with a phrase like “black on black crime,” the truth is that most white Americans are killed by other white Americans. So I guess we either have to take “black on black crime” out of our vocabularies or we have to add “white on white crime,” to make sense of these kinds of phenomena.

You also don’t get access to any kind of black studies literature, or any scholarly kind of framings of what does this mean and what do these kinds of movements mean, how are they characterized? Think back to the way the public talks about…in the Pew study. You think that what you get on the first page is fair and accurate, credible and trustworthy. And here you have Dylann Roof, who’s engaging with Google, and maybe we could argue, in a similar way.

I’ll just say that when I was looking for these kinds of stereotypes and trying to make sense of them, our field is not off the hook. I looked for “black on white crime,” in fact, in the UCLA library—this was when I was at UCLA.

And you know, it’s a challenge. Because if you look for, let’s say, “racist imagery” like this, it’s very difficult to find. In fact, when I was looking for it, here I am looking for “black stereotypes” in Artstor, which is our largest digital collection of images in the library. Apparently there are only six black stereotypes available, according to Artstor.

Now, we know that’s not true. So now I’m trying to think about, well, what’s the metadata? What’s happening in the way in which ideas are being characterized? And I thought, well, librarians…they’re youngish, maybe they say African-American instead of black. That must be it. That must be the difference.

Except that now we only have forty-two results, and in them we’re getting paintings, oils on canvases…here’s a painting by Theodore Kaufmann, who was a German painter painting post-Civil War. So that’s not exactly the kind of stereotype—I’m not even sure if that is a stereotype.

So I thought, well, let me look on racism again. Here’s racism. In racism, we have images, screenshots of a really phenomenal satirical web site that used to be up called Rent-a-Negro but is now down, because Damali Ayo, who was the artist who created this web site right after President Obama was elected— You know, this was like a satire about liberal white people who were all like, “I voted for Obama because I have black friends,” and then they’d have to produce the black friend at a dinner party and couldn’t? And so they could go rent a Negro. And so this was like a really great, funny web site about that particular phenomenon of kind of liberal white racism. But that work, which is really clearly about anti-racist discourse, is cataloged here under racism. It’s just interesting.

Now I’m looking, and I finally start to find some of this racist memorabilia of the United States, and it’s characterized kind of— You can find it under the words “black history [and] racism.”

Now, I find this interesting because some of my colleagues and I might argue that if we had been the catalogers, maybe we would have characterized this as white history. Again, a way of thinking and interpreting with different lenses about what’s happening in this particular phenomenon.

“Black history,” just on its own without “racism,” gives us back good old Thomas Waterman Wood and his paintings. And so again we have a lot of work to do. We have this conception that out there in the commercial search spaces it’s terrible, or we kind of know, especially in our field. But I think we have to also interrogate our own information systems and the way in which these kinds of racialized discourses are produced, reproduced, and go unchallenged.

Some of the things I think we can do: I think one of the things we have to do is really reject the idea that our field is neutral and that we’re simply information brokers. We have an incredible responsibility. You know, I often think what it would be like if we centered ethnic studies and gender studies at the epicenter of information schools, or at the epicenter of computer science programs. I tell my computer science students, for example, who come and take classes with me, which…you know, you can imagine how hard that is for them sometimes? And they say, “No one in four years has ever talked to me about the implications of the work that I’m doing. I haven’t really ever thought about these things ever.”

And I say to them, “You can’t design technologies for society when you don’t know anything about society.” I just don’t know how that’s possible. So what would it mean if we re-centered the people, rather than centering the technology, and thought out from a different lens?

So I think I’ll leave it there and give us a couple of minutes for questions. Thank you.

Audience 1: I had a question in terms of how do you think the best approach is to combat these sort of algorithms? Because I know human moderation is one option, but also most of these algorithms are based off of frequency. And if things like pornography are the highest frequency things on the Internet, then how do you combat that in terms of [inaudible] your actual search?

Safiya Noble: I anticipated that question, so don't think I'm a weirdo that I just go right to the slide. So I kind of knew.

So, in the book I try to theorize a little bit around this. And one of the things that I do is I talk about what would it mean if we had different interfaces? Now, I don't think there is a technical solution to these problems. So I want to say that first. I don't think you can technically mitigate against structural racism and sexism. I think we have to also mitigate at the level of society. But we also have to be mindful if we're in the epicenter of technology design, we should also be thinking about these issues.

So one of the things I do— Now, my parents were artists. So the color picker tool is actually a thing that's always been in my life for a really long time. So I thought about it as a metaphor for search. What would it mean if I put my black girls box in the red light district of the Internet? I know what the deal is, right. This is…again, like it's a counternarrative to ranking as a paradigm. Because here's the thing. The cultural context of ranking is that if you're number one… Like, they don't make a big foamy finger that's like "we're number one million three hundred…" Nobody does that. If you're number one, you're number one, that's what matters in the cultural context of the United States, for example, or in the West, or in lots of places. So the first thing I would do is break out of the ranking orientation. Because that has a cultural context of meaning.
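As a minimal sketch of this break from ranking as a paradigm (the data, URLs, and "context" labels below are all invented for illustration and don't reflect any real search system), one could imagine grouping results side by side under the context they come from, rather than collapsing everything into a single ordered list:

```python
# Hypothetical sketch: present results by context instead of by rank.
# All data and field names here are illustrative assumptions.
from collections import defaultdict

def group_by_context(results):
    """Group results under their declared context instead of ranking them."""
    grouped = defaultdict(list)
    for result in results:
        grouped[result["context"]].append(result["url"])
    return dict(grouped)

results = [
    {"url": "shop.example/page", "context": "commercial"},
    {"url": "journal.example/article", "context": "scholarly"},
    {"url": "blog.example/essay", "context": "personal"},
]

# Every context appears alongside the others; there is no single
# "number one" result to anchor the page.
for context, urls in sorted(group_by_context(results).items()):
    print(context, urls)
```

The design point is that the output structure itself refuses a first place, so no context can monopolize the user's attention the way position one does in a ranked list.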

If we start getting into something like this, now we actually have the possibility of knowing that everything that we're indexing is coming from a point of view. This is really hard to do, let me say by the way. I've been meeting, working with computer scientists. Super hard. We're theorizing and trying to get money to experiment with this. And I'm always open to collaborations around this.

But what this does is this gives us an opportunity. Immediately as a user we know like oh, there's many perspectives that might be happening around, or contexts within which a query could happen.

Also, what if we had opt in to the racism, the sexism, the homophobia? Because right now, for those of us who are in any kind of marginalized identity who are bombarded with that kind of content, which is many people, as the default, we don't get to unsee it. We don't get to be unaffected by it. So what if people actually had to choose that they wanted that? That also to me is a cognitive kind of processing. Rather than making it normative, we actually have to take responsibility for normalizing that in our own lives, in our own personal choices.
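One way to read that opt-in idea as a design sketch (the flags and items below are invented for the example, not any platform's real moderation data): content flagged as racist, sexist, or homophobic is excluded by default, and surfaced only when a user explicitly chooses to see it.

```python
# Hypothetical sketch of opt-in filtering: flagged content is hidden by
# default and shown only on explicit opt-in. Data is illustrative.
def filter_results(results, opt_in=False):
    """Hide flagged items unless the user has explicitly opted in."""
    if opt_in:
        return list(results)
    return [r for r in results if not r.get("flagged", False)]

items = [
    {"title": "news story", "flagged": False},
    {"title": "hate-site screed", "flagged": True},
]

default_view = filter_results(items)                # flagged content excluded
opted_in_view = filter_results(items, opt_in=True)  # user chose to see it
```

The reversal is the whole point: the burden of a choice shifts from the people harmed by the content to the people who want it.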

This is just one metaphor for how we could also reconceptualize the information environment. Again, this isn't new. But it's more like keeping at the forefront that the context within which we're looking for things always has a particular bias. What I'm trying to do here is say there will always be a bias. Maybe there's a commercial bias. Maybe there's a pornographic… Maybe there's an entertainment bias. Maybe there's a government, kind of noncommercial… Maybe we're in the realm of blogs and personal opinion and we know that. That's different than doing a search on "black on white crime" and thinking that the first thing you get has been vetted by a multinational corporation that you trust, with a brand you trust. The Council of Conservative Citizens; that's what I mean by it.

Audience 2: So I sort of…so to follow up on that a little because I think that notion of the ranking is kind of an interesting thing in here, right. Because, I mean I think people forget, right, before Google there were other search engines that often did not have the notion of taking you directly to the thing, but instead had the notion of showing you the range of stuff.

Noble: Yes. Totally.

Audience 2: So one part is the list as output. One part is the whole notion of the like, "I feel lucky" button; there is one result and I can take you directly to it. And so it may be that also one of the opportunities here to sort of recast things is recasting what one gets back as deliberately a wide trawl, and it's like well you know, there's ten things over here, and there's a hundred things over there, and there's a thousand things over there. And that's the spread. So it might be not just the input but the output.

But the other thing I sort of want to think about a little bit with you is how we better give people a sense of the coupling between search and results, right. It's not just that there's a world of things out there, but obviously that it's statistically affected by search patterns. That is, these things are attempting to respond to what it is that people are looking for, one way or another.

Noble: Yeah. I mean, the visualization dimensions of information to me are so important to the future of making sense of nonsense, too. I could talk for a while with you about this. Because I think the other dimension, kind of the mathematical formulation, that's not present. I mean, it would be quite fascinating if you had a loading bar showing you that a calculation is running, one that was legible also to, say, an eighth-grader. Maybe it was in text and not in kind of a statistical language. Because you know, those are also shortcuts and representations of ideas. And yet they're framed I think as like a truth, a mathematical truth. And that's also quite difficult to dislodge.

So I think that's happening. I think that making those kinds of things visible is really important. Dislodging the idea even that there is a truth that can be pointed to. I mean, one of the things that is challenging, that I'm always trying to share with my students, is that they are acculturated to doing a search and believing there is an answer that can be found in .03 seconds, and that is deeply problematic. Because we're in a university where we have domains, and departments, and disciplines who have been battling over ideas and interpretations of an idea for thousands of years. And so that flies in the face of the actual thing we're doing here in a university.

And so acculturating people to the idea that there's an instant result, that a truth can be known quickly, and of course that it's obscured by some type of formulation that is truthful, right, that's leading them. A statistical process that's leading them to something that's more legitimate than something else, again gets us back into these ideas of who has more voice and more power, more coupling, more visibility online than others. And this goes back to my black girls, who don't have the capital or the numbers to ever break through that. It's hard. But these are the questions, for sure.

Audience 3: Thanks for the wonderful presentation. I really really appreciated it. So I have a question about the racist images button. I know that we're not supposed to focus on the slide but I think it's really awesome, this apparatus that you're suggesting is kind of really exciting.

But I was just wondering like, the idea of a racist images filter, going back to your work on library search engines, I feel like that is part of the problem, right. Like it's this idea when machines and algorithms [inaudible] communities of people trying to find something like racism, we get into this sort of push and pull where there are very very very almost comically wrong answers. So I was wondering what your thoughts are on that filter. Because I know there must be some deeper thoughts going on there.

Noble: Yeah. I mean, here's where I think about the work of my colleague Sarah Roberts around content moderators. That this is a laborious practice. Who decides what's racist? I was on a panel yesterday with a woman who used to be a content moderator for MySpace back in the day. And she's like, "You know, the team was me," she's a black woman, "a gang banger, like a Latin King; a white supremacist; a woman who was a witch…" You know what I'm saying? She's like, that was the early MySpace content moderation team in Santa Monica. She's like, "And also we were drunk and high because it was so painful looking at the kind of heinous stuff that comes through the Web and curating it out of MySpace."

Coupled with this, obviously, has to be a making visible of the labor that is involved in curating taste and making sense of where we bottom out in humanity in terms of what gets on to the platforms that we're engaging with. I mean there's armies of people who are curating all kinds of things out. The most disgusting things you could ever imagine. Ras was saying yesterday, she said, "I couldn't shake hands with anybody for three years because I couldn't believe what human beings were capable of," alright. So there was a great content moderation conference at UCLA yesterday, for the last couple days.

You would also have to make the people legible and visible, make this type of work visible. Because machines cannot do that tastemaking work. Not yet, and probably not for a very long time. We're nowhere near that. So I think that's also what has to happen. Again, with Rent-a-Negro, someone who didn't get it thought that was racist. But the black people who would have seen it would've been like, "That is actually hella funny." See what I'm saying? So those are sensibilities that we can't ignore, that can't be automated. They're also kind of political, quite frankly. And that certainly has to be part of how we reimagine the technologies that we're designing and engaging with. I don't think you can automate that. I think you absolutely have to have experts.

One of the things I say to the media all the time when they ask me about changing Silicon Valley and the stuff that comes out of it is I say you know, pair people who have PhDs and graduate degrees in black studies with the engineering team. Or PhDs in gender studies. Or American studies or something, right. People…humanists and social scientists. Because they have different kinds of expertise that might help us nuance and categorize some of these ideas. We don't do that. And this is such…it's…you know, it's really difficult in our information schools when we don't put these ideas at the center of thinking about curating human information needs. They're way off in the margin right now.

Audience 4: So, like you said, technology can't do really…much of the taste-making. They can't really classify, right.

Noble: Yes.

Audience 4: So, the racist, misogynistic content that we see on search engines is pretty much a manifestation of how the Internet society thinks, right. Don't you think they are complicit in perpetuating these stereotypes? Because Google only aggregates what's popular, what's frequent.

Noble: Yeah, but Google also prioritizes what it will make money off of. So there's also a relationship between people who pay it, through AdWords in particular, to optimize. And SEO companies who will pay top dollar to have certain types of content circulate more than other types of content. And so this is where to be like, "Well Google's like…" You know, if Google didn't make any money off of trafficking content? We could maybe argue that it's just like Sergey Brin says, it's just the algorithm producing the algorithm's results. Except it makes money off of the things that will be the most titillating, that will go viral, that people will click on. And it couples ads in direct relationship to its "organic results."

So there's no way… I mean, it's making money on both sides of the thin line on the page. So I think that it's disingenuous to say that Google's not implicated in the kinds of results that we get. I think there's a lot of research by others, too, that shows that they will always kind of propagate what they make money from first, alright. So again, this is where people who love Google hate me. Because I'm just going to say they're also implicated in it. At a minimum, to what effect… You know, when you talk to people who— When you think about YouTube. You know, the beheadings in Syria: out, screened out by content moderators. Blackface? In. Why? Who decides? What are the values? Those are what's at play in terms of the way it's implicated, and the decisions that they make about acceptability. The acceptability of racism or misrepresentations of certain groups but not of others.
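A back-of-the-envelope sketch of this monetization point (the scoring formula and all numbers are invented for illustration, not Google's actual ranking): if paid promotion contributes any weight to a result's score, a less relevant but better-funded page can outrank a more relevant one.

```python
# Hypothetical illustration: topical relevance plus an ad-spend weight.
# Neither the formula nor the numbers reflect any real search engine.
def score(page, ad_weight=0.5):
    """Combined score: relevance plus weighted paid promotion."""
    return page["relevance"] + ad_weight * page["ad_spend"]

pages = [
    {"name": "community-archive", "relevance": 0.9, "ad_spend": 0.0},
    {"name": "optimized-advertiser", "relevance": 0.6, "ad_spend": 1.0},
]

ranked = sorted(pages, key=score, reverse=True)
# With ad spend weighted in, the advertiser (0.6 + 0.5 = 1.1) edges out
# the more relevant but unfunded page (0.9).
```

This is the sense in which a platform that profits from promotion cannot claim its results are simply "what the algorithm produces": the weight itself is a business decision.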
