Zeynep Tufekci: Hi, everyone. It is a great pleasure to be here. Thank you for inviting me. And I want to talk about two things. I want to talk about inclusion and centralization, and exclusion and decentralization. So, a lot of times these conversations revolve around Facebook. And it’s not just because Facebook is the biggest platform there is for these things. But it really is the best example of these dual trends.

To give a little bit of background on how important Facebook is, I’d urge you, especially if you’re technically competent and you run your own Linux and you’re very good at multiple platforms and different things, to step outside that box and think of inclusion globally. There are many people—billions of people, in fact—for whom tasks that might seem intuitive and obvious to you are not. Those of us who are technically more competent, or who’ve been using these platforms, or computers, for a longer time, tend to underestimate what it means to have something usable. I work with people around the world all the time, and a lot of times even the simplest things—you know, where is the search bar versus the URL bar… Those are confusing.

So, a couple of major platforms like Facebook and Twitter, YouTube, have become in many places around the world a de facto public sphere. Especially in countries that have less than free Internet, less than free mass media. And these countries have transitioned from a very controlled public sphere to a commercially-run one like Facebook, but [one] open to most people, in the space of a few years. So that’s the context. Once again, I worry about these things not just in the United States or the United Kingdom, where there are a lot of issues with the mass media system, but it’s nothing like a place like Turkey, which I’m from. Or Burma, which has transitioned into a more democratic state in just a couple of years, which I’ll use as an example. So these are the kind of places we’re talking about.

So to explain this conflict, I’m going to talk about an example from the United States that happened a couple of years ago that shows how the inclusion/exclusion dynamics work. Now, on Facebook, as you might have heard, it’s a common discussion— There is a dynamic called “network effects”; sometimes people call it “network externalities.” And these refer to the idea that if you have, say, a fax machine and you want to fax people, the one that’s compatible with theirs is the one you’re going to use. So if you want to message people, you’re going to message them on the platform they’re on—if they’re on Facebook you’re going to message them on Facebook. So that has really led to a huge amount of growth.
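The network-effects dynamic described here can be sketched as a toy simulation. Everything in this sketch is invented for illustration (the platform names, the head start, the friend count); it only shows the mechanism: each new user joins whichever platform already holds more of their friends, so a small early lead compounds into near-total dominance.

```python
import random

# Toy model of network effects (all parameters invented for illustration):
# each new user joins whichever platform already holds more of their
# randomly sampled friends, so an early lead compounds into dominance.
random.seed(1)

def simulate(n_users=10_000, n_friends=8, head_start=60):
    # Platform A starts with a small head start over platform B.
    sizes = {"A": head_start, "B": 50}
    members = ["A"] * sizes["A"] + ["B"] * sizes["B"]
    for _ in range(n_users):
        # Sample this user's friends from the existing member pool,
        # then join the platform where more of them already are.
        friends = random.choices(members, k=n_friends)
        choice = "A" if friends.count("A") >= friends.count("B") else "B"
        sizes[choice] += 1
        members.append(choice)
    return sizes

sizes = simulate()
share_a = sizes["A"] / (sizes["A"] + sizes["B"])
print(f"Platform A share after growth: {share_a:.0%}")
```

The reinforcement is the whole point: there is no quality difference between the two platforms in this model, yet the market still tips toward one of them, which is why "just build a better alternative" is not enough by itself.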

But that’s not the only thing. It’s also quite usable and easy to access. But it’s a commercial platform. It’s the public sphere, but it’s run to sell ads. Which means that there are all sorts of decisions that go into it that mediate what you get to see. And in particular on Facebook, the news updates that you see from your friends, from your family, from acquaintances, from pages you follow are prioritized and picked by an algorithm that Facebook sets, often changes, and decides what to do with. And this algorithm of course is meant—as far as they’re telling us, and I believe it—to increase “engagement” on the site, which means to keep you there. That has a lot of consequences.

So I’m going to show you guys a picture. On August 13, 2014, I was on Twitter one quiet evening, and I was looking at news coming from a small protest in Ferguson, Missouri in the United States. For the year before that, or two years before that, there had been a lot of social media noise voluntarily made by people who were trying to initiate a movement which has since—we call it Black Lives Matter, but at the beginning it was kind of just people discussing various cases that weren’t getting a lot of attention. And one of the first ones I noticed was the killing of Trayvon Martin in Florida, where the killer wasn’t even tried in court, and there were a lot of publicity efforts around that on social media for weeks. And then it got some mass media attention, and there was a trial, even though there wasn’t a conviction.

So on August 13, about a year after that, there was another piece of news, a very sad one. An African American teenager had been shot and killed in Ferguson, Missouri. And there had just been tornadoes in the area a couple of days before. And a couple of journalists were nearby, so they went.

So in Ferguson, this community was angry that another young man had been shot. And his body had been just left there for hours. They were upset, they were grieving. And they started having these small-scale protests. And this is kinda like suburban US, right.

Tweets, Zeynep Tufekci, August 13, 2014, at 6:13 PM and 8:27 PM

So, the journalists go there. There are these protests. And this is what we started seeing on Twitter. This is like armored vehicles, snipers. I put up my tweets but there are a couple of these journalists, and they’re just sort of taking these pictures. And I started retweeting, as you can see. It wasn’t much but I was like, “What is going on?”

I have a lot of friends around the world, especially in a lot of more repressive countries, and they were like, “Zeynep, are those pictures from Bahrain or someplace else? What’s going on?” And my friends in Egypt were talking about this. And there was this huge amount of discussion of this protest on my Twitter timeline, which is not algorithmically mediated. It’s just chronological.

So I went over to Facebook, where I have a lot of friends. And I said alright, what are my friends there seeing? What is going on? They weren’t talking about it. Okay…maybe my friends on Facebook don’t care about this.

I went back to Twitter, and I saw that all my friends, it seemed, were talking about it. It was just completely consuming the conversation. You know how you have some big news event, or the Olympics or something, and everybody’s talking about it? That’s how it felt. I’d go to Facebook…nobody cares. Maybe I have the wrong kind of friends on Facebook, you know? You just wonder. You go to Twitter, it’s all there.

But I don’t really have different friends on Facebook. I have more family, you know, acquaintances. So I switched my Facebook feed to chronological. Which Facebook doesn’t want to let you do, because it wants to control that feed, which is your eyeballs, which is your attention, which in this day and age of attention being the key resource is crucial. It wants to keep you there, it wants to control that.

When I switched to chronological, I saw that my friends were talking about it, but Facebook wasn’t showing it to me. Back to Twitter, it’s all Ferguson. Switch to Facebook, if I force it chronological it’s Ferguson, Ferguson. But boom, it keeps coming back. Because Facebook switches it back to the algorithmic feed.

So I said wait, what’s going on? And I started asking around on Twitter, which is where I could ask. There were hundreds, maybe thousands of posts, of people who were complaining about this. They were like, “Why isn’t…why aren’t I seeing this on Facebook? Why is this huge news not there?” So I just kept going back and forth for the evening.

So what happened was that instead of showing us this news, Facebook was showing us the Ice Bucket Challenge of August 2014. Do you remember that? It was a good thing. People were dumping buckets of ice water on themselves and challenging each other to donate to an ALS charity. And they were doing this by tagging people and recording video. These are all things Facebook’s algorithm likes. Especially—it’s changed a little bit since then, but at the time you could only Like something. Right now you can do a heart, or you can do tears, which give a little more range of options. How do you Like the one on the left side? How do you like that? It’s not a likable story.

I saw this again and again. I work on refugee support, and I come across a lot of heartbreaking pieces of news and media. If I post them, they don’t really get around—they’re not likable. The switch to hearts and tears has helped. Which shows you how important these decisions are.

So later, because of another controversy on Facebook trending topics, which is different than what I’m talking about, News Feed, they published some internal guidelines with some demo tools. And by coincidence, they happened to have a shot of that week’s data— Not by coincidence. Obviously that was something they were worried about, too, because it got attention. And it showed yes, the ALS Ice Bucket Challenge had drowned out the Ferguson news.

About three to four million tweets were sent before either mass media or Facebook’s algorithm would acknowledge what was going on. This really makes me wonder: have we ever had a choke point of centralization that controlled what a billion and a half people might see in this roundabout way, in which they don’t necessarily control what you’re going to see but they control the game we play on to determine what you will see. They set the rules. And their rules at the moment are engagement. Which means also virality. Which means we have studies showing now (it’s hard to study because you don’t even see it; Facebook has the data) that fake news, outrageous news, angry claims—or very heart-warming stories—are what goes viral.

Now, you’re going to say this is human nature, people like this. This is very true. People like these things. People like to feel better. They seek these views themselves. But we have a feedback loop in which what people are kind of leaning towards is shoved at them very fast.

So with Ferguson, what had happened was also a feedback loop. Because it was hard to Like the Ferguson news, the algorithm showed it to fewer people, which led to fewer exposures, which led to fewer people being able to comment on it, which led to this algorithmic spiral of silence.
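This spiral can be made concrete with a small, purely illustrative model. The like rates, the audience size, and the renormalization rule below are all invented; the point is only the loop structure: exposure follows the engagement score, and the score is recomputed from the likes that exposure produces.

```python
# Toy model of the "algorithmic spiral of silence" (illustrative only):
# a feed shows each story to an audience proportional to its engagement
# score, and the score is updated from the likes it actually receives.
# A story people find hard to "Like" loses reach every round, even if
# just as many people care about it.

def simulate_feed(like_rate, rounds=5, audience=1000, seed_score=1.0):
    score, reach_history = seed_score, []
    for _ in range(rounds):
        reach = int(audience * min(score, 1.0))  # exposure follows score
        likes = reach * like_rate                # engagement on that exposure
        score = likes / (audience * 0.1)         # renormalized engagement score
        reach_history.append(reach)
    return reach_history

ice_bucket = simulate_feed(like_rate=0.30)  # fun, taggable, likable
ferguson   = simulate_feed(like_rate=0.05)  # grim news, hard to "Like"
print("likable story reach per round:", ice_bucket)
print("hard-to-like story reach per round:", ferguson)
```

Under these made-up numbers the likable story holds its full audience every round while the hard-to-like story's reach halves each round: fewer exposures produce fewer likes, which produce fewer exposures, exactly the spiral described above.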

We have the other things. We have for example—right now in the United States I see this every day, false stories about the prevalence of voter fraud are going viral on Facebook all the time. Because they’re outrageous. They’re false, but they’re outrageous. And this is fueling people organizing themselves—vigilantes—to show up at the polls to “monitor” in “inner cities.” Meaning they are going to go harass black people trying to vote. This is 2016.

Now, you might think you know, this is kinda exaggerated. Is this really going to happen? In Burma, we transitioned from a very closed public sphere in which the military controlled everything to one of the fastest digital expansions. Everybody has some sort of feature phone or smartphone and some kind of access.

We have had Facebook posts— Facebook is the public sphere. We have had instances of outrageous, horrible, obviously false claims about the growing Muslim minority— You know, it’s the kind of thing you see in ethnic cleansing situations. Claims that they kill babies or do these horrible things, which have led to this kind of hate speech going viral in a context of ethnic cleavages where we don’t have intermediary institutions and hundreds of years of state-building. And they have led to hundreds of deaths in the past few years. And the biggest refugee outflow in the region. And maybe third or fourth in the world right now, because there are so many refugee outflows. This is…I mean, I know, open and connected and inclusive sounds great. But it comes with all these things. It comes with Facebook’s ability to choke a piece of news. But it also comes with its ability to spread and not choke these things that previous gatekeepers might’ve not spread so much. Or they might’ve. This is a very important inflection point.

I gave you one story of Facebook suppressing, and one story of Facebook not suppressing. Because I don’t want to sound like there is an easy way to do this. I don’t want to say Facebook should just, you know, censor this kind of hate speech and not censor that kind of thing, because this is not…we’re talking a billion and a half people there. It should do a lot more, but “it should censor nothing, it should suppress nothing” is not an answer, because something’s going to be shown. There are going to be choices made. There’s no escaping the difficult choices.

So, I want to show you guys one more thing from my own personal feed, which is why “go to alternative sites” is not going to work. This is a post I had— I just came from Turkey. This is my grandmother. She’s doing fine now. She had a stroke, I was really worried. Just before coming here I flew there. I posted this little update on Facebook. I was able to connect, using this, to relatives and acquaintances who mobilized to help her and support her at this time. There are no other tools that I can even imagine… A lot of these people, they don’t email. They do not have any email. They might’ve had an email that somebody set up for them to set up Facebook, but if I said, “Can I have your email?” they’re like, “Uh, my phone number?” Like, “No, your email. Can you Facebook friend me?”

And there are dozens and dozens of people like that among my relatives and acquaintances. So I wouldn’t have been able to organize this had I not had a tool that had this much centralization behind it. I’m throwing out these things to complicate it. I can get off Facebook. That is not the question. The question is, how do we create alternate ways of doing these things? Like the way Twitter played a role in pushing Ferguson in spite of Facebook. So that we have ways of getting around choke points.

But how do we also deal with the fact that when you have no choke points, like Burma—I don’t know if Facebook’s checking what’s going on, I don’t know how much staff they have. It’s a little country. They have lots of countries. It’s just one little country. But you have this sort of ethnic cleansing situation going on. So if they don’t do something to put their thumbs on the scales one way or another—which they are going to do, one way or another—that’s not an answer, either.

I don’t think we have, you know, “let’s just create decentralized things” as an easy option. But I also don’t think we are going towards a very healthy world, because our centralized options are all ad-financed, encouraging virality, clickability…which is encouraging polarization, false news, outrage, or the feel-good stuff. That’s why you have a lot of babies on your feed, and that’s why you have here in England a lot of Brexit fights on your feed. Both of them attract eyeballs, which is what Facebook is selling.

So I’m going to open it up to questions, because I don’t have a conclusion. I have a lot of questions. But I feel, given the combination of both elite failure and these dynamics coming up in the world that are shaking multiple countries, how do we organize our public sphere to encourage healthier, inclusive behaviors rather than just outrage or feel-good? It probably is one of the most important questions we face right now, so we’ll take questions for the remainder of the time. And I welcome other ways of contacting me. I did put my email up.


Audience 1: There are a lot of big mass-market brand advertisers that don't just want to reach a nationalist niche or a specific side of an argument. What are big companies that put money into Facebook advertising doing about having their brands associated with this kind of speech?

Zeynep Tufekci: Well, I think they can target many different groups. Obviously, it's true that both Facebook and Twitter trend towards more freewheeling speech, which means some advertisers don't like that as much. I think for many of them, they're… I mean, if you're Coca-Cola, you probably care about it. But if you're advertising, say, in a country like Turkey or India or Burma, I don't think it's an impediment. I don't think that's a problem. And also what you can do is you can just segment your audiences. And we live with advertising. It just seems to be a fact of life, because people are there and there you are.

Audience 2: Hi, I'm interested in whether you think it's primarily a technology solution or a [tool?] solution.

Tufekci: Okay, so this is a multi-level problem. We absolutely need more resources and alternative technologies, because right now—because of network effects—there are no real market constraints on Facebook. You're not going to outcompete it by itself. So we should try to make sure there are alternatives that people could use that put some market pressure on Facebook. So that's one thing we should do. And it's not just Facebook. This is true for those few big platforms—for Twitter, for Google and YouTube. So alternatives are important.

I think pushing, sort of complicating— Or I guess, look, I got on the Internet when it came to Turkey. Like, I was one of the first few users. I loved it. It opened up my world, and I thought, like Mark Zuckerberg says right now, open and connected is great. It is much more complicated than that, as we've come to see. It's not the case that if we open everything up and just make connection possible, only good things are going to happen. So we need to complicate how we think about connection and inclusivity, and bring more normative values besides "including people" into the conversation. Because including, you know, white supremacists so that they can go viral, so that they can go harass black voters, is also a form of inclusion. But you have to say inclusion by itself is not the only value. It has to be paired with all the other things we're balancing, including other people's right to exist.

So there's that. And there's also resource problems here. Because if we're going to address these things, the way we do a lot of things on the Internet is a merry band of volunteer geeks come together and type up some stuff. And it's great, and it does some things. But minor things like downtime means you're never going to be inclusive enough to be an alternative, which requires resources. So there's a multi-level, I think, answer.

And finally, the final answer is really a new way of grappling with the question, "What does happen if billions of people are connected?" I think we just started asking that question, and some of the answers are not as pretty and sort of rosy as we might've thought when this started. And I include myself in the sort of evolving [?].

Audience 3: Hi. I was wondering how you would talk about folding in how the ad structure and some of the features of Facebook allow for a monetarily-based form of suppression. Recently they've been reporting on how ads have been specifically avoiding targeting black and poorer voters, as well as there being dark ad campaigns to suppress votes.

Tufekci: So, thank you very much for that question. The other thing this public sphere allows— And I wrote this in 2012 about the Obama campaign, which had used sort of sophisticated voter targeting. And I said wait, sophisticated voter targeting? In a private place where you don't see what other people can see? That means a campaign could come and target hate speech to people who are receptive to it, and nobody else sees it. Because if it's on TV, it's horrible to have horrible things on TV. But at least it's there to be publicly challenged and countered.

And so I worried about this kind of silent, private but widespread hate speech or other sort of incitement being used by political campaigns. And I wrote this in the context of [2012]—Obama's election. A lot of my friends were like, "We like Obama." I said it doesn't matter. This is not Thor's hammer, that only the purest of heart can pick up, right? This isn't going to work that way. And I got a lot of flack from my friends who had worked on Obama's campaign, and who were data scientists. And they said, "No no, this will be our tool."

Well, enter 2016, and ProPublica just published this thing that shows that on Facebook you can literally segment so that you can exclude African Americans—or any other ethnicity—from your ad targeting. Now, this is what happens when engineers do not talk to other people while saying "Let's put more selectors out there. Isn't this great?" No, it's not. It's illegal if it's housing, and it can do all sorts of other horrible things. If it's a job ad, can you imagine? Here's a job ad, but don't show it to these ethnicities. Housing ad, don't show it. Or, here's a "let's organize voter protection" (which is actually "let's go harass black people at the polls"), but only target white people.

All of this had been possible until now. And it would've been private. So this is another thing. I had suggested that there be a centralized depository where Facebook lets us know who targeted whom. It is so important. Yes, they're not going to commercially tell us I got this or you got that, but they can do certain things. And I think the fact that such a simple thing—you do not allow this kind of exclusion for certain kinds of ads—passed them by shows that they especially need to do it.

But. Last thought: with enough data on you, I do not need to know your race, as a category, to deduce your probable race, using proxies. So even if Facebook cleans this up—which they will, because it's illegal—it is absolutely possible to use computational inference to do something like "advertise some snake-oil solution only to black women who are feeling depressed these days," without using black or depressed as direct selectors. Because with enough data you can tell these things.
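The proxy point can be illustrated with a fully synthetic sketch. Every feature, rate, and group membership below is made up (nothing here resembles real Facebook data or its ad API); it only demonstrates the mechanism: a score built from correlated behaviors recovers a group that is never named as a selector.

```python
# Toy illustration (entirely synthetic data) of targeting by proxy:
# the "advertiser" never uses the protected attribute as a selector,
# yet a score built from correlated proxy behaviors still concentrates
# the ad on the group it is nominally not allowed to target.
import random

random.seed(7)

def make_user(in_target_group):
    # Proxies correlate with group membership without ever naming it.
    return {
        "group": in_target_group,  # hidden ground truth, never used below
        "likes_page_a": random.random() < (0.8 if in_target_group else 0.2),
        "active_late_night": random.random() < (0.7 if in_target_group else 0.3),
        "zip_bucket_9": random.random() < (0.6 if in_target_group else 0.1),
    }

users = [make_user(i % 2 == 0) for i in range(10_000)]

def proxy_score(u):
    # Naive additive score over proxies; no protected attribute is read.
    return u["likes_page_a"] + u["active_late_night"] + u["zip_bucket_9"]

targeted = [u for u in users if proxy_score(u) >= 2]
hit_rate = sum(u["group"] for u in targeted) / len(targeted)
print(f"share of targeted audience in the protected group: {hit_rate:.0%}")
```

Even this crude three-proxy score pushes the targeted audience far above the 50% base rate of the protected group, which is the worry: removing the explicit selector does not remove the capability.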

So not only is this new public sphere centralized, a chokehold, it has— The past one was television, a baseball bat kind of banging you on the head, and you went "ow." It was crude. Right now we have a scalpel that can do exclusion in targeting in all sorts of ways that television couldn't. And maybe there are advantages to not having this much precise power in the hands of people with power and money already. So that's… I mean, it's a great extra layer of difficult questions to deal with.

Further Reference

MozFest 2016 web site

