Thanks, everyone. It’s a pleasure to be here today. I’m going to set my timer here so that I don’t run over, because that would be a bad thing. Alright. I don’t know if I need to do anything to turn myself on here. That sounded bad. But I actually mean my slides.

I’m going to talk with you tonight about the corporate control of information and why we should care. And I’m going to try to do that as quickly as possible. This is really a talk that deserves more time, and I’m so excited to be on this distinguished panel with these incredible makers and researchers as well.

The thing that is so important to me, and why I study corporate control over information, specifically Google, is that in many ways the public is increasingly reliant upon corporate-provided information. And that provision often happens by way of alleged neutrality, or the objective kinds of truth that these kinds of platforms claim to provide.

Specifically, I study Google because Google really dominates the market. It’s not just a market leader; it holds a monopoly in the search space. And when you think about it, if you’re looking for information on the Web and you don’t know a specific URL, you are required to go through a broker, so to speak, to help you find the information you might be looking for. Those brokers are search engines. And we relate to search engines in so many ways. The latest research, for example, shows that the majority of the public in the United States believes that search engines are fair and unbiased sources of information. People also feel (and you can look at Pew research on this) that the kind of information they find in commercial search engines like Google is credible and trustworthy. So this to me means that we have to pay close attention to what’s happening in those spaces.

Now, some of you might be familiar with this campaign. This is one of the more popular campaigns that has come about to try to problematize some of the kinds of things that we find in search engines. This is a campaign that was sponsored by the United Nations. Ogilvy & Mather Dubai (that’s a large advertising agency) did the campaign. They did searches on various women of color, in various countries, to see what kinds of things Google would populate as an auto-suggest. What they found, for example, were things like:

  • Women cannot drive, be bishops, be trusted, speak in church
  • Women shouldn’t have rights, vote, work, box
  • Women should stay at home, be slaves, be in the kitchen, not speak in church
  • Women need to be put in their places, know their place, be controlled, be disciplined

This was an interesting and fairly effective campaign in pointing out that society still holds a whole lot of sexist, patriarchal values around women. But what the campaign failed to do was really contextualize why these kinds of results come up. It left most readers of the campaign believing that search engines are simply neutral, simply providing the results that are most popular or most searched on.

But I have found in my research (I’ve done quite a bit of research collecting searches, specifically on women and girls of color) that when you start to click on these auto-suggestions, they’re linked to sites that are incredibly profitable for Google. For example, they might be linked to sites that are heavily populated by keywords used through Google’s AdWords program: blogs or web sites that heavily use AdWords.

So Google makes money on the kind of traffic that happens in relationship to some of these searches. And I think this is one of the failings of the campaign: it doesn’t help us understand and make sense of this commercial aspect, what the corporatization or commercialization of information means to an advertising company like Google. I often try to tell people that Google is not providing information retrieval algorithms, it’s providing advertising algorithms. And that is a very important distinction when we think about what kind of information is available in these corporate-controlled spaces.

Here’s another example. Some of you might have seen this in the summer. A very well-known Black Lives Matter activist, DeRay Mckesson, tweeted about it.

What was happening in Google Maps this summer, many people noticed, is that if you were to go into Google Maps and search on “n‑word house” or “n‑word king,” Google Maps would take you to the White House. The way that this was reported on by the media, The Washington Post picking it up first and then others following, is that there must be some type of glitch in the system. And when we talk about these kinds of racist experiences and pointers that happen in technical systems, we also hear in the public discourse these things talked about, again, as anomalies, as glitches, rather than helping us to understand and unveil the ways that programmers are people who write, and code is a language. And all languages are value-laden, including binary code languages.

What this also tells us: Google gave a kind of non-apology, one of those “we apologize for any offense this may have caused” statements. I can tell you that when my husband says something like, “I’m sorry if you’re offended,” that is not actually an apology. So the non-apology apology often comes forward from Google in these cases, where they might even go so far as to issue a disclaimer about the kinds of problematic search results that come back. But again, the onus is typically placed back on the user: that somehow you searched incorrectly, maybe you should’ve used different words. Never, again, pointing to the host of decisions, algorithmically, that are made to get us to these kinds of pointers. And this is really important.

You can’t see this. I don’t know why it’s been blocked out. Sorry the image didn’t come forward, but I’ll tell you what happened here. You can see the main hit. I started collecting searches, for example on the words “black girls,” “Latina girls,” “Asian-American girls,” “South Asian girls,” “indigenous girls,” back in 2009. I was motivated partially because a colleague of mine, Andre Brock, had mentioned it. We were talking about Google and some of the problematics, and he said, “Oh yeah, you should see what happens when you Google ‘black girls.’ ” And I was like, “What? I’m a black girl. What happens when you search for ‘black girls’?” And of course I have six nieces and a daughter. What you would have seen here would be a list of highly pornographic and sexual results. The first hit in 2009, when you did a search for “black girls,” was hotblackpussy.com; by 2011 that site had gone under and sugaryblackpussy.com really dominated the landscape as the first hit when you looked for “black girls.”

Now let me say again that where the bias lies is that you didn’t have to search for “black girls and sex,” or “black girls and porn.” “Black girls” metaphorically meant sex and porn, as did “Latinas” and “Asian girls,” and so forth. “White girls” didn’t fare too much better; [they] also were sexualized. And then the term “girls” was really coopted in a very Sexism 101 way, because all of these sites, as you clicked down them, were not girls. They were not children, they were not adolescents; they were women, grown women. And so we have to look at these examples and say, again, what does it mean?

I first tried to write about this for Essence magazine, which is a magazine that focuses on women of color, and they wouldn’t have it because, who are you? I don’t know who you are. You can’t write for a major news outlet when you’re virtually unknown. So I thought I’d write this article for the public and I contacted Bitch magazine. They’re a progressive feminist magazine that critiques society and culture. And I couldn’t convince them to let me write this article. They were like, “Everybody knows when you search for girls you’re going to get porn.” And I was like, “Do they?”

So I said this is problematic: how girls become stand-ins for pornography, and who gets sexualized, and how. And you can see a mapping, if you look in the porn studies literature, of the racial hierarchy as well: more violent forms of pornography and sexualization as you go through a racial order in the United States. Eventually, after about ten emails, I told Bitch, “Listen, why don’t you do a search for ‘women’s magazines’ and then let me know if you find Bitch in the first five pages.” And then I got the story.

So the thing is that they understood, finally, that the concept of feminism had been divorced from women. And so it’s really important to talk about concepts, how concepts get framed algorithmically. This is a really important part of my work, and I’ve written about this. I’ve actually got a book that’ll be forthcoming next year called Algorithms of Oppression, and it’s really to elucidate how these processes happen.

It’s not just a matter of representation and misrepresentation, of pornification, though that’s incredibly important. There are other, more nuanced ways in which algorithmic bias is happening. Here you have an article in Forbes, a negative article written against a study by Epstein and Robertson, who found that voter preferences could be manipulated quite easily based on the kinds of results that showed up on a first page of search results.

So if negative stories about a candidate, especially at the local level, circulated on the first page of search results, people voted against them. If positive stories circulated on the first page, people voted for them. And those things are highly manipulable. One of the things that Matthew Hindman, for example, wrote about in his book The Myth of Digital Democracy is this notion that what we find in these online news environments is just a matter of an unbiased free flow of information. He found in his research studying elections that the people who had the most money were able to influence what showed up in the first pages. So there is a political-economic critique of what happens.

The first page of search results is so incredibly important, because the majority of people don’t go past it. So what happens there is highly contested and something that we must pay close attention to.

This is one of the last things I want to share, something I recently wrote about: again, to talk about and elucidate the way that concepts get formed, knowledge gets created, and knowledge biases happen.

Dylann “Storm” Roof is an avowed white supremacist who opened fire on a church in South Carolina this summer and murdered in cold blood nine African-American worshipers. This is an excerpt from his manifesto that was found online. I wanted to draw attention to a couple of things that are really important. He says,

The event that truly awakened me was the Trayvon Martin case. I kept hearing and seeing his name and eventually I decided to look him up. I read the Wikipedia article and right away I was unable to understand what the big deal was. It was obvious that Zimmerman was in the right. But more importantly this prompted me to type in the words “black on White crime” into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black on White murders. I was in disbelief.

And then he goes on to talk about how this could be happening. He talks about researching even more, and this affirming his commitment to white supremacy. And through all this research, much of which we can gather happened online through Wikipedia and Google, he says,

From here I found out about the Jewish problem and other issues facing our race, and I can say today that I am completely racially aware.

Now, it’s not far-fetched to think that many people are coming into their various forms of consciousness, not just racial consciousness, through the use of these kinds of platforms. Consider what doesn’t happen when you go to, for example, the Council of Conservative Citizens. The Council of Conservative Citizens is a cloaked web site. Jessie Daniels writes about cloaked web sites: web sites that pretend to be a neutral kind of media, or [an] objective site, but are in fact doing something different. The Council of Conservative Citizens has a web site that just looks like a conservative media aggregator feeding out news. But it’s actually a very well-documented white supremacist organization. It’s like the businessman’s KKK and has been for a long time.

What Dylann Roof didn’t get when he did searches like “black on white crime” is a counterpoint, for example, that says there is no such thing as “black on white crime.” You don’t get FBI statistics that disprove the concept of black on white crime. You don’t get information from black studies scholars, for example, who might talk about what even framing a question like “black on white crime” means in the context of contemporary American society. So these are some of the things that I’m doing in my research to try to, again, make us aware of the critical importance of what these corporate-controlled information environments are about.

And I would just say that Big Data and technology biases don’t just end in our first-world, US, Western, Global North context. Much of my work for the past few years has been about Google and misrepresentation, certainly. But if you want to extend the kinds of biases that happen in terms of political and economic policy, you can see the number of companies that are implicated in things like the extraction and mining industries, where you have some of the worst political and sexual violence happening in direct relationship to the extraction of minerals that we need for our electronics, for our technologies. Again, these things are hidden from view.

We also don’t see the incredible e‑waste, the e‑waste cities that are popping up along the west coast of Africa, in Ghana. These are incredibly toxic situations where people are literally exchanging their lives, in many ways, in the extraction and waste industries, hidden from view.

So we have to think again about the ways our fetishes around technology are implicated in these kinds of inhumane situations for other people around the world.

I’ll leave it there, and I’ll look forward to talking with you during the Q&A.

Further Reference

Later, there was a panel discussion and Q&A session.

Biased Data: A Panel Discussion on Intersectionality and Internet Ethics at the Processing Foundation web site.