Zeynep Tufekci: Hi, everyone. It is a great pleasure to be here. Thank you for inviting me. I want to talk about two things: inclusion and centralization, and exclusion and decentralization. A lot of times these conversations revolve around Facebook. And it’s not just because Facebook is the biggest platform there is for these things; it really is the best example of these dual trends.

To give a little bit of background on how important Facebook is, I’d urge you, especially if you’re technically competent and you run your own Linux and you’re very good at multiple platforms and different things, to step outside that box and think of inclusion globally. There are many people—billions of people, in fact—for whom tasks that might seem intuitive and obvious to you are not. Among the technically more competent, or people who’ve been using these platforms for a longer time, or computers for a longer time, we tend to underestimate what it means to have something usable. I work with people around the world all the time, and a lot of times even the simplest things—you know, where is the search bar versus the URL bar… Those are confusing.

So, a couple of major platforms like Facebook and Twitter, YouTube, have become in many places around the world a de facto public sphere. Especially in countries that have less than free Internet, less than free mass media. And these countries have transitioned from a very controlled public sphere to a commercially-run one like Facebook, but [one] open to most people, in the space of a few years. So that’s the context. Once again, I worry about these things not just in the United States or the United Kingdom, where there are a lot of issues with the mass media system, but it’s nothing like in a place like Turkey, which I’m from. Or Burma, which has transitioned into a more democratic state in just a couple of years, which I’ll use as an example. So these are the kinds of places we’re talking about.

So to explain this conflict, I’m going to talk about an example from the United States that happened a couple of years ago that shows how the inclusion/exclusion dynamics work. Now, on Facebook, as you might have heard, it’s a common discussion— There is a dynamic called “network effects”; sometimes people call it “network externalities.” These refer to the idea that if you have, say, a fax machine and you want to fax people, the machine that’s compatible with theirs is the one you’re going to use. So if you want to message people, you’re going to message them on the platform they’re on—if they’re on Facebook you’re going to message them on Facebook. So that has really led to a huge amount of growth.
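The fax-machine dynamic can be made concrete with a toy simulation (this is my own illustrative sketch, not anything from the talk; all the numbers are invented): if each newcomer simply joins whichever platform most of their contacts are already on, a modest early lead compounds into near-total dominance.

```python
import random

def simulate(n_users=10_000, contacts=8, head_start=60, seed=1):
    """Toy network-effects model: each newcomer polls a few random
    earlier users and joins the platform the majority of them use."""
    rng = random.Random(seed)
    # Platform A starts with a modest head start over platform B.
    members = ["A"] * head_start + ["B"] * 40
    for _ in range(n_users):
        sample = [rng.choice(members) for _ in range(contacts)]
        # Join whichever platform dominates among your sampled contacts.
        members.append("A" if sample.count("A") >= sample.count("B") else "B")
    return members.count("A") / len(members)

share = simulate()
print(f"Platform A's final share: {share:.0%}")
```

Under these assumed parameters, platform A’s initial 60/40 edge snowballs into the overwhelming majority of all users, which is why a platform with a network-effects lead is so hard to outcompete.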

But that’s not the only thing. It’s also quite usable and easy to access. But it’s a commercial platform. It’s the public sphere, but it’s run to sell ads. Which means that there are all sorts of decisions that go into it that mediate what you get to see. And in particular on Facebook, it’s run by the news updates that you see from your friends, from your family, from acquaintances, from pages you follow. They are prioritized and picked by an algorithm that Facebook sets, often changes, and decides what to do with. And this algorithm of course is meant—as far as they’re telling us, and I believe it—to increase “engagement” on the site, which means to keep you there. That has a lot of consequences.

So I’m going to show you guys a picture. On August 13, 2014, I was on Twitter one quiet evening, and I was looking at news coming from a small protest in Ferguson, Missouri in the United States. For the year before that, or two years before that, there had been a lot of social media noise voluntarily made by people who were trying to initiate a movement which has since—we call it Black Lives Matter, but at the beginning it was kind of just people discussing various cases that weren’t getting a lot of attention. And one of the first ones I noticed was the killing of Trayvon Martin in Florida, where the killer wasn’t even tried in court at first, and there were a lot of publicity efforts around that on social media for weeks. And then it got some mass media attention, and there was a trial, even though there wasn’t a conviction.

So on August 13, about a year after that, there was another piece of news, a very sad one. An African American teenager had been shot and killed in Ferguson, Missouri. There had just been tornadoes in the area a couple of days before, and a couple of journalists were nearby, so they went.

So in Ferguson, this community was angry that another young man had been shot. And his body had been just left there for hours. They were upset, they were grieving. And they started having these small-scale protests. And this is kinda like suburban US, right.

Tweets, Zeynep Tufekci, August 13, 2014, at 6:13 PM and 8:27 PM

So, the journalists go there. There are these protests. And this is what we started seeing on Twitter. This is like armored vehicles, snipers. I put up my tweets, but there are a couple of these journalists, and they’re just sort of taking these pictures. And I started retweeting, as you can see. It wasn’t much, but I was like, “What is going on?”

I have a lot of friends around the world, especially in a lot of more repressive countries, and they were like, “Zeynep, are those pictures from Bahrain or someplace else? What’s going on?” And my friends in Egypt were talking about this. And there was this huge amount of discussion of this protest on my Twitter timeline, which is not algorithmically mediated. It’s just chronological.

So I went over to Facebook, where I have a lot of friends. And I said alright, what are my friends there seeing? What is going on? They weren’t talking about it. Okay…maybe my friends on Facebook don’t care about this.

I went back to Twitter, and I saw that all my friends, it seemed, were talking about it. It was just completely consuming the conversation. You know how you have some big news event, or the Olympics or something, and everybody’s talking about it? That’s how it felt. I’d go to Facebook…nobody cares. Maybe I have the wrong kind of friends on Facebook, you know? You just wonder. You go to Twitter, it’s all there.

But I don’t really have different friends on Facebook. I have more family, you know, acquaintances. So I switched my Facebook feed to chronological. Which Facebook doesn’t want to let you do, because it wants to control that feed, which is your eyeballs, which is your attention, which in this day and age of attention being the key resource is crucial. It wants to keep you there, it wants to control that.

When I switched to chronological, I saw that my friends were talking about it, but Facebook wasn’t showing it to me. Back to Twitter, it’s all Ferguson. Switch to Facebook, if I force it chronological it’s Ferguson, Ferguson. But boom, it keeps coming back, because Facebook keeps switching the feed back to the algorithmic one.

So I said wait, what’s going on? And I started asking around on Twitter, which is where I could ask. There were hundreds, maybe thousands of posts, of people who were complaining about this. They were like, “Why isn’t…why aren’t I seeing this on Facebook? Why is this huge news not there?” So I just kept going back and forth for the evening.

So what happened was that instead of showing us this news, Facebook was showing us the Ice Bucket Challenge of August 2014. Do you remember that? It was a good thing. People were dumping buckets of ice water on themselves and challenging each other to donate to ALS charity. And they were doing this by tagging people and recording video. These are all things Facebook’s algorithm likes. Especially—it’s changed a little bit since then, but at the time you could only Like something. Right now you can do a heart, or you can do tears, which give you a little more range of options. How do you Like the one on the left side? How do you like that? It’s not a likable story.

I saw this again and again. I work on refugee support, and I come across a lot of heartbreaking pieces of news and media. If I post them, they don’t really get around—they’re not likable. The addition of hearts and cries has helped. Which shows you how important these decisions are.

So later, because of another controversy over Facebook’s Trending Topics—which is a different product than the News Feed I’m talking about—they published some internal guidelines with some demo tools. And by coincidence, they happened to have a shot of that week’s data— Not by coincidence. Obviously that was something they were worried about too, because it got attention. And it showed that yes, the ALS Ice Bucket Challenge had drowned out the Ferguson news.

About three to four million tweets were sent before either mass media or Facebook’s algorithm would acknowledge what was going on. This really makes me wonder: have we ever had a choke point of centralization that controlled what a billion and a half people might see, in this roundabout way in which they don’t necessarily control what you’re going to see, but they control the game we play on to determine what you will see? They set the rules. And their rules at the moment are engagement. Which means also virality. Which means we have studies showing now (it’s hard to study because you don’t even see it; Facebook has the data) that fake news, outrageous news, angry claims—or very heart-warming stories—are what goes viral.

Now, you’re going to say this is human nature, people like this. This is very true. People like these things. People like to feel better. They seek out these views themselves. But we have a feedback loop in which what people are kind of leaning towards is shoved at them very fast.

So with Ferguson, what had happened was also a feedback loop. Because it was hard to Like the Ferguson news, the algorithm showed it to fewer people, which led to fewer exposures, which led to fewer people being able to comment on it, which led to this algorithmic spiral of silence.
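That spiral can be sketched as a toy model (my own illustration with invented numbers; real News Feed ranking is far more complex): allocate each story impressions in proportion to its past Likes, and a story people hesitate to Like falls further behind every round.

```python
# Toy model of an engagement-ranked feed (illustrative only).
# Two stories start equally newsworthy, but one is easy to "Like"
# and one is grim news that few people click Like on.
def simulate_feed(rounds=20, viewers_per_round=1000):
    stories = {
        "ice_bucket": {"likes": 1, "like_rate": 0.10},
        "ferguson":   {"likes": 1, "like_rate": 0.01},
    }
    for _ in range(rounds):
        total = sum(s["likes"] for s in stories.values())
        for s in stories.values():
            # The algorithm allocates impressions in proportion to past
            # engagement; fewer impressions means fewer new Likes.
            impressions = viewers_per_round * s["likes"] / total
            s["likes"] += impressions * s["like_rate"]
    return {name: round(s["likes"]) for name, s in stories.items()}

print(simulate_feed())
```

With these assumptions the likable story ends up with orders of magnitude more Likes than the unlikable one, even though both started identically: the spiral of silence in miniature.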

We have the other things. We have, for example—right now in the United States I see this every day—false stories about the prevalence of voter fraud going viral on Facebook all the time. Because they’re outrageous. They’re false, but they’re outrageous. And this is fueling people organizing themselves—vigilantes—to show up at the polls to “monitor” in “inner cities.” Meaning they are going to go harass black people trying to vote. This is 2016.

Now, you might think, you know, this is kinda exaggerated. Is this really going to happen? Well, Burma transitioned from a very closed public sphere, in which the military controlled everything, to one of the fastest digital expansions. Everybody has some sort of feature phone or smartphone and some kind of access.

We have had Facebook posts— Facebook is the public sphere there. We have had instances of outrageous, horrible, obviously false claims about what the growing Muslim minority apparently does— You know, it’s the kinds of things you see in ethnic cleansing situations. Claims that they kill babies or do these horrible things. And this kind of hate speech has gone viral in a context of ethnic cleavages, without intermediary institutions and hundreds of years of state-building. And it has led to hundreds of deaths in the past few years. And the biggest refugee outflow in the region. And maybe the third or fourth in the world right now, because there are so many refugee outflows. This is…I mean, I know, open and connected and inclusive sounds great. But it comes with all these things. It comes with Facebook’s ability to choke a piece of news. But it also comes with its ability to spread, and not choke, things that previous gatekeepers might not have spread so much. Or they might’ve. This is a very important inflection point.

I gave you one story of Facebook suppressing, and one story of Facebook not suppressing. Because I don’t want to sound like there is an easy way to do this. I don’t want to say Facebook should just, you know, censor this kind of hate speech and not censor that kind of thing, because this is not…we’re talking about a billion and a half people there. It should do a lot more, but “it should censor nothing, it should suppress nothing” is not an answer, because something’s going to be shown. There are going to be choices made. There’s no escaping the difficult choices.

So, I want to show you guys one more thing from my own personal feed, which is why “go to alternative sites” is not going to work. This is a post I had— I just came from Turkey. This is my grandmother. She’s doing fine now. She had a stroke, and I was really worried. Just before coming here I flew there. I posted this little update on Facebook. I was able to connect, using this, to relatives and acquaintances who mobilized to help her and support her at this time. There are no other tools I can even imagine doing that… A lot of these people, they don’t email. They do not have any email. They might’ve had an email that somebody set up for them to set up Facebook, but if I said, “Can I have your email?” they’re like, “Uh, my phone number?” Like, “No, your email.” “Can you Facebook friend me?”

And there are dozens and dozens of people like that among my relatives and acquaintances. So I wouldn’t have been able to organize this had I not had a tool with that much centralization behind it. I’m throwing out these things to complicate it. I can get off Facebook. That is not the question. The question is: how do we create alternate ways of doing these things? Like the way Twitter played a role in pushing Ferguson in spite of Facebook—so that we have ways of getting around choke points.

But how do we also deal with the fact that when you have no choke points, like in Burma—I don’t know if Facebook’s checking what’s going on; I don’t know how much staff they have. It’s a little country. They have lots of countries. It’s just one little country. But you have this sort of ethnic cleansing situation going on. So if they don’t do something to put their thumbs on the scales one way or another—which they are going to do, one way or another—that’s not an answer, either.

I don’t think we have, you know, “let’s just create decentralized things” as an easy option. But I also don’t think we are going towards a very healthy world, because our centralized options are all ad-financed, encouraging virality, clickability…which is encouraging polarization, false news, outrage, or the feel-good stuff. That’s why you have a lot of babies on your feed, and that’s why you have here in England a lot of Brexit fights on your feed. Both of them attract eyeballs, which is what Facebook is selling.

So I’m going to sort of open it up to questions, because I don’t have a conclusion. I have a lot of questions. But given the combination of elite failure and these dynamics coming up in the world that are shaking multiple countries, how do we organize our public sphere to encourage healthier, inclusive behaviors rather than just outrage or feel-good? That is probably one of the most important questions we face right now, so we’ll take questions for the remainder of the time. And I welcome other ways of contacting me. I did put up my email.


Audience 1: There are a lot of big mass-market brand advertisers that don’t just want to reach a nationalist niche or a specific side of an argument. What are big companies that put money into Facebook advertising doing about having their brands associated with this kind of speech?

Zeynep Tufekci: Well, I think they can target many different groups. Obviously, it’s true that both Facebook and Twitter tend towards the more freewheeling, which means some advertisers don’t like that as much. I think for many of them, they’re… I mean, if you’re Coca-Cola, you probably do care about it. But if you’re advertising in, say, a country like Turkey or India or Burma, I don’t think it’s an impediment. I don’t think that’s a problem. And also what you can do is you can just segment your audiences. And we live with advertising. I think it just seems to be a fact of life, because people are there and there you are.

Audience 2: Hi, I’m interested in whether you think it’s primarily a technology solution or a [tool?] solution.

Tufekci: Okay, so this is a multi-level problem. We absolutely need more resources and alternative technologies, because right now—because of network effects—there are no real market constraints on Facebook. You’re not going to outcompete it by itself. So we should try to make sure there are alternatives that people could use, that put some market pressure on Facebook. So that’s one thing we should do. And it’s not just Facebook. This is true for those few big platforms—for Twitter, for Google and YouTube. So alternatives are important.

I think pushing, sort of complicating— I guess, look: I got on the Internet when it came to Turkey. I was one of the first few users. I loved it. It opened up my world, and I thought, like Mark Zuckerberg says right now, that open and connected is great. It is much more complicated than that, as we’ve come to see. There’s no guarantee that if we open everything up and just make connection possible, only good things are going to happen. So we need to complicate how we think about connection and inclusivity, and bring more normative values besides “including people” into the conversation. Because including, you know, white supremacists so that they can go viral, so that they can go harass black voters, is also a form of inclusion. You have to say that inclusion by itself is not the only value. It has to be paired with all the other things we’re balancing, including other people’s right to exist.

So there’s that. And there are also resource problems here. Because if we’re going to address these things, the way we do a lot of things on the Internet is a merry band of volunteer geeks come together and type up some stuff. And it’s great, and it does some things. But minor things like downtime mean you’re never going to be inclusive enough to be an alternative, which requires resources. So there’s a multi-level, I think, answer.

And finally, the final answer is really a new way of grappling with the question, “What does happen if billions of people are connected?” I think we just started asking that question, and some of the answers are not as pretty and sort of rosy as we might’ve thought when this started. And I include myself in the sort of evolving [?].

Audience 3: Hi. I was wondering how you would talk about folding in how the ad structure and some of the features of Facebook allow for a monetarily-based form of suppression. Recently there has been reporting on how ads have been specifically avoiding targeting black and poorer voters, as well as there being dark ad campaigns to suppress votes.

Tufekci: So, thank you very much for that question. The other thing this public sphere allows— I wrote about this in 2012, on the Obama campaign, which had used sort of sophisticated voter targeting. And I said: wait, sophisticated voter targeting? In a private place where you don’t see what other people can see? That means a campaign could come and target hate speech to people who are receptive to it, and nobody else sees it. Because if it’s on TV, it’s horrible to have horrible things on TV, but at least it’s there to be publicly challenged and countered.

And so I worried about this kind of silent, private but widespread hate speech or other sorts of incitement being used by political campaigns. And I wrote this in the context of [2012]—Obama’s election. A lot of my friends were like, “We like Obama.” I said it doesn’t matter. This is not Thor’s hammer, that only the purest of heart can pick up, right? This isn’t going to work that way. And I got a lot of flak from my friends who had worked on Obama’s campaign, and who were data scientists. And they said, “No no, this will be our tool.”

Well, enter 2016, and ProPublica just published this thing that shows that on Facebook you can literally segment so that you can exclude African Americans—or any other ethnicity—from your ad targeting. Now, this is what happens when engineers do not talk to other people while saying, “Let’s put more selectors out there. Isn’t this great?” No, it’s not. It’s illegal if it’s housing, and it can do all sorts of other horrible things. If it’s a job ad, can you imagine? Here’s a job ad, but don’t show it to these ethnicities. Housing ad, don’t show it. Or, here’s a “let’s organize voter protection” (which is actually “let’s go harass black people at the polls”), but only target white people.

All of this had been possible until now. And it would’ve been private. So this is another thing. I had suggested that there be a centralized repository where Facebook lets us know who targeted whom. It is so important. Yes, for commercial reasons they’re not going to tell us exactly who got this or who got that, but they can do certain things. And I think the fact that such a simple thing—you do not allow this kind of exclusion for certain kinds of ads—got past them shows that they especially need to do it.

But. Last thought: with enough data on you, I do not need to know your race, as a category, to deduce your probable race, using proxies. So even if Facebook cleans this up—which they will, because it’s illegal—it is absolutely possible to use computational inference to do something like advertise some snake-oil solutions only to “black women who are feeling depressed these days,” without using “black” or “depressed” as direct selectors. Because with enough data you can tell these things.
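That proxy argument can be demonstrated with a small stdlib-only sketch on synthetic data (every name and number here is invented for illustration): when a protected attribute correlates with ordinary targeting selectors, a simple frequency count over those selectors recovers it most of the time.

```python
import random
from collections import Counter, defaultdict

rng = random.Random(0)

def make_user():
    """Synthetic user: 'group' is the protected attribute an advertiser
    may not select on; neighborhood and liked page are ordinary
    selectors that merely correlate with it."""
    group = rng.choice(["X", "Y"])
    neighborhood = rng.choices(["north", "south"],
                               weights=[8, 2] if group == "X" else [2, 8])[0]
    page = rng.choices(["page_a", "page_b"],
                       weights=[7, 3] if group == "X" else [3, 7])[0]
    return group, (neighborhood, page)

train = [make_user() for _ in range(5000)]
holdout = [make_user() for _ in range(1000)]

# "Training": for each proxy combination, record the most common group.
counts = defaultdict(Counter)
for group, proxies in train:
    counts[proxies][group] += 1
guess = {p: c.most_common(1)[0][0] for p, c in counts.items()}

# Inference on unseen users, never touching the protected attribute.
accuracy = sum(guess[p] == g for g, p in holdout) / len(holdout)
print(f"Inferred the protected attribute for {accuracy:.0%} of users.")
```

Here just two weakly informative proxies already recover the hidden attribute for roughly four users in five; with the hundreds of selectors a real ad platform holds, the inference only gets sharper.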

So not only is this new public sphere centralized and a choke point, it has— The past one was television: a baseball bat, kind of banging you on the head, and you went “ow.” It was crude. Right now we have a scalpel that can do exclusion and targeting in all sorts of ways that it couldn’t before. And maybe there are advantages to not having this much precise power in the hands of people who already have power and money. So that’s… I mean, it’s a great extra layer of difficult questions to deal with.

Further Reference

MozFest 2016 web site
