Zeynep Tufekci: Hi, everyone. It is a great pleasure to be here. Thank you for inviting me. I want to talk about two things: inclusion and centralization, and exclusion and decentralization. So, a lot of times these conversations revolve around Facebook. And it’s not just because Facebook is the biggest platform there is for these things; it really is the best example of these dual trends.
To give a little bit of background on how important Facebook is, I’d urge you, especially if you’re technically competent and you run your own Linux and you’re very good at multiple platforms and different things, to step outside that box and think of inclusion globally. There are many people, billions of people in fact, for whom tasks that might seem intuitive and obvious to you are not. Those of us who are technically more competent, or who’ve been using these platforms, or computers, for a longer time, tend to underestimate what it means to have something usable. I work with people around the world all the time, and a lot of times even the simplest things, you know, where the search bar is versus the URL bar… Those are confusing.
So, a couple of major platforms, Facebook, Twitter, YouTube, have become in many places around the world a de facto public sphere. Especially in countries with a less-than-free Internet and less-than-free mass media. And these countries have transitioned from a very controlled public sphere to a commercially-run one like Facebook, but one open to most people, in the space of a few years. So that’s the context. Once again, I worry about these things not just in the United States or the United Kingdom, where there are a lot of issues with the mass media system, but those are nothing like a place like Turkey, which I’m from. Or Burma, which has transitioned into a more democratic state in just a couple of years, and which I’ll use as an example. So these are the kinds of places we’re talking about.
So to explain this conflict, I’m going to talk about an example from the United States that happened a couple of years ago that shows how the inclusion/exclusion dynamics work. Now, as you might have heard, it’s a common discussion: on Facebook there is a dynamic called “network effects;” sometimes people call it “network externalities.” It refers to the idea that if you have, say, a fax machine and you want to fax people, you’re going to use the machine that’s compatible with theirs. So if you want to message people, you’re going to message them on the platform they’re on; if they’re on Facebook you’re going to message them on Facebook. So that has really led to a huge amount of growth.
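To make the network-effect idea concrete, here is a minimal sketch, purely illustrative and not from the talk: the number of possible user-to-user connections grows roughly with the square of the user count, so each new user makes the platform disproportionately more valuable to everyone already on it.

```python
# Minimal sketch of network effects: the number of possible pairwise
# connections grows roughly quadratically with the number of users,
# so each new user adds more potential value than the last.

def possible_connections(n_users: int) -> int:
    """Distinct pairs of users who could message (or fax) each other."""
    return n_users * (n_users - 1) // 2

for n in [10, 100, 10_000, 1_000_000]:
    print(f"{n:>9,} users -> {possible_connections(n):>17,} possible connections")
```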
But that’s not the only thing. It’s also quite usable and easy to access. But it’s a commercial platform. It’s the public sphere, but it’s run to sell ads. Which means that all sorts of decisions go into it that mediate what you get to see. In particular, on Facebook the news updates that you see from your friends, from your family, from acquaintances, from pages you follow are prioritized and picked by an algorithm that Facebook sets, often changes, and decides what to do with. And this algorithm of course is meant, as far as they’re telling us, and I believe it, to increase “engagement” on the site, which means to keep you there. That has a lot of consequences.
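Facebook’s actual ranking code is proprietary and the talk doesn’t describe it; as a hedged illustration of what “prioritized by an engagement algorithm” means, a feed ranker might look something like this sketch, where every signal and weight is invented:

```python
from dataclasses import dataclass

# Hypothetical engagement-ranked feed. The signals and weights are invented
# for illustration; this is NOT Facebook's actual News Feed algorithm.

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    age_hours: float

def engagement_score(post: Post) -> float:
    # Weight interactions that keep people on the site, and decay older posts.
    raw = 1.0 * post.likes + 4.0 * post.comments + 8.0 * post.shares
    return raw / (1.0 + post.age_hours)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest-scoring posts are shown first; low scorers may never be seen.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Ice Bucket Challenge video", likes=900, comments=150, shares=200, age_hours=2.0),
    Post("Protest news from Ferguson", likes=40, comments=60, shares=10, age_hours=2.0),
])
for post in feed:
    print(f"{engagement_score(post):8.1f}  {post.text}")
```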
So I’m going to show you guys a picture. On August 13, 2014, I was on Twitter one quiet evening, and I was looking at news coming from a small protest in Ferguson, Missouri in the United States. For a year or two before that, there had been a lot of social media noise voluntarily made by people who were trying to initiate a movement which we now call Black Lives Matter, but at the beginning it was kind of just people discussing various cases that weren’t getting a lot of attention. And one of the first ones I noticed was the killing of Trayvon Martin in Florida, where the killer wasn’t even tried in court at first, and there were a lot of publicity efforts around that on social media for weeks. And then it got some mass media attention, and there was a trial, even though there wasn’t a conviction.
So on August 13, about a year after that, there was another piece of news, a very sad one. An African American teenager had been shot and killed in Ferguson, Missouri. And there had just been tornadoes in the area a couple of days before, so a couple of journalists were nearby, and they went.
So in Ferguson, this community was angry that another young man had been shot. And his body had been just left there for hours. They were upset, they were grieving. And they started having these small-scale protests. And this is kinda like suburban US, right.
So, the journalists go there. There are these protests. And this is what we started seeing on Twitter: armored vehicles, snipers. I’ve put up my tweets here, but there were a couple of these journalists, and they were just taking these pictures. And I started retweeting, as you can see. It wasn’t much, but I was like, “What is going on?”
I have a lot of friends around the world, especially in a lot of more repressive countries, and they were like, “Zeynep, are those pictures from Bahrain or someplace else? What’s going on?” And my friends in Egypt were talking about this. And there was this huge amount of discussion of this protest on my Twitter timeline, which is not algorithmically mediated. It’s just chronological.
So I went over to Facebook, where I have a lot of friends. And I said alright, what are my friends there seeing? What is going on? They weren’t talking about it. Okay…maybe my friends on Facebook don’t care about this.
I went back to Twitter, and I saw that all my friends, it seemed, were talking about it. It was just completely consuming the conversation. You know how you have some big news event, or the Olympics or something, and everybody’s talking about it? That’s how it felt. I’d go to Facebook…nobody cares. Maybe I have the wrong kind of friends on Facebook, you know? You just wonder. You go to Twitter, it’s all there.
But I don’t really have different friends on Facebook; I have more family, you know, acquaintances. So I switched my Facebook feed to chronological. Which Facebook doesn’t want to let you do, because it wants to control that feed, which is your eyeballs, which is your attention; and in this day and age, attention is the key resource. It wants to keep you there, it wants to control that.
When I switched to chronological, I saw that my friends were talking about it, but Facebook wasn’t showing it to me. Back to Twitter, it’s all Ferguson. Switch to Facebook, and if I force it chronological it’s Ferguson, Ferguson. But boom, it keeps coming back. Because Facebook switches it back to the algorithmic feed.
So I said wait, what’s going on? And I started asking around on Twitter, which is where I could ask. There were hundreds, maybe thousands, of posts from people who were complaining about this. They were like, “Why isn’t…why aren’t I seeing this on Facebook? Why is this huge news not there?” So I just kept going back and forth for the evening.
So what happened was that instead of showing us this news, Facebook was showing us the Ice Bucket Challenge of August 2014. Do you remember that? It was a good thing. People were dumping buckets of ice water on themselves and challenging each other to donate to an ALS charity. And they were doing this by tagging people and recording video. These are all things Facebook’s algorithm likes. And, it’s changed a little bit since then, but at the time you could only Like something. Right now you can do a heart, or you can do tears, which gives you a little more range of options. But how do you Like the one on the left side? How do you Like that? It’s not a likable story.
I saw this again and again. I work on refugee support, and I come across a lot of heartbreaking pieces of news and media. If I post them, they don’t really get around; they’re not likable. The switch to hearts and tears has helped. Which shows you how important these decisions are.
So later, because of another controversy over Facebook’s Trending Topics, which is a different feature from the News Feed I’m talking about, they published some internal guidelines with some demo tools. And by coincidence, they happened to have a shot of that week’s data— Not by coincidence. Obviously that was something they were worried about too, because it got attention. And it showed that yes, the ALS Ice Bucket Challenge had drowned out the Ferguson news.
About three to four million tweets were sent before either mass media or Facebook’s algorithm would acknowledge what was going on. This really makes me wonder: have we ever had a choke point of centralization that controlled what a billion and a half people might see, in this roundabout way where they don’t necessarily control what you’re going to see, but they control the game we play that determines what you will see? They set the rules. And their rules at the moment are engagement. Which means also virality. Which means we have studies showing now (it’s hard to study, because you don’t even see it; Facebook has the data) that fake news, outrageous news, angry claims, or very heart-warming stories, are what goes viral.
Now, you’re going to say this is human nature, that people like this. This is very true. People like these things. People like to feel better. They seek this stuff out themselves. But we have a feedback loop in which whatever people are kind of leaning towards is shoved at them very fast.
So with Ferguson, what had happened was also a feedback loop. Because it was hard to Like the Ferguson news, the algorithm showed it to fewer people, which led to fewer exposures, which led to fewer people being able to comment on it, which led to this algorithmic spiral of silence.
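That spiral is a compounding feedback loop, and a toy simulation, with numbers made up entirely for illustration, shows how it works: if each round of exposure is proportional to the engagement earned in the previous round, a story with a slightly lower engagement rate collapses toward zero reach while a slightly more “likable” one keeps growing.

```python
# Toy simulation of the algorithmic spiral of silence. Reach in each round
# is proportional to engagement earned in the previous round. All numbers
# are invented for illustration.

def simulate_reach(engagement_rate: float, rounds: int = 6,
                   initial_reach: float = 1000.0,
                   amplification: float = 20.0) -> list[float]:
    """Each round, the feed shows the story to ~amplification viewers per
    engagement (like/comment/share) it earned in the previous round."""
    reach = initial_reach
    history = [reach]
    for _ in range(rounds):
        engagements = reach * engagement_rate
        reach = engagements * amplification
        history.append(reach)
    return history

likable = simulate_reach(engagement_rate=0.06)       # grows: 0.06 * 20 = 1.2x per round
hard_to_like = simulate_reach(engagement_rate=0.03)  # shrinks: 0.03 * 20 = 0.6x per round
print("likable story:     ", [round(r) for r in likable])
print("hard-to-Like story:", [round(r) for r in hard_to_like])
```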
We have the other side too. Right now in the United States, for example, I see this every day: false stories about the prevalence of voter fraud are going viral on Facebook all the time. Because they’re outrageous. They’re false, but they’re outrageous. And this is fueling people organizing themselves, vigilantes, to show up at the polls to “monitor” in “inner cities.” Meaning they are going to go harass black people trying to vote. This is 2016.
Now, you might think, you know, this is kind of exaggerated. Is this really going to happen? Well, Burma transitioned from a very closed public sphere in which the military controlled everything to one of the fastest digital expansions. Everybody has some sort of feature phone or smartphone and some kind of access.
We have had Facebook posts— Facebook is the public sphere there. We have had instances of outrageous, horrible, obviously false claims about the growing Muslim minority— You know, it’s the kind of thing you see in ethnic cleansing situations: claims that they kill babies or do these horrible things. This kind of hate speech has gone viral in a context of ethnic cleavages, where there aren’t intermediary institutions and hundreds of years of state-building. And it has led to hundreds of deaths in the past few years. And the biggest refugee outflow in the region, and maybe the third or fourth biggest in the world right now, because there are so many refugee outflows. This is… I mean, I know, open and connected and inclusive sounds great. But it comes with all these things. It comes with Facebook’s ability to choke a piece of news. But it also comes with its ability to spread, and not choke, things that previous gatekeepers might not have spread so much. Or they might have. This is a very important inflection point.
I gave you one story of Facebook suppressing, and one story of Facebook not suppressing, because I don’t want to sound like there is an easy way to do this. I don’t want to say Facebook should just, you know, censor this kind of hate speech and not censor that kind of thing, because we’re talking about a billion and a half people there. It should do a lot more, but “it should censor nothing, it should suppress nothing” is not an answer, because something’s going to be shown. There are going to be choices made. There’s no escaping the difficult choices.
So, I want to show you guys one more thing from my own personal feed, which is why “go to alternative sites” is not going to work. This is a post I had— I just came from Turkey. This is my grandmother. She’s doing fine now. She had a stroke, and I was really worried. Just before coming here I flew there. I posted this little update on Facebook, and using it, I was able to connect with relatives and acquaintances who mobilized to help her and support her at this time. There are no other tools I can even imagine doing that with. A lot of these people don’t email. They do not have any email. They might’ve had an email that somebody set up for them so they could set up Facebook, but if I said, “Can I have your email?” they’d be like, “Uh, my phone number?” “No, your email.” “Can you Facebook friend me?”
And there are dozens and dozens of people like that among my relatives and acquaintances. So I wouldn’t have been able to organize this had I not had a tool with this much centralization behind it. I’m throwing out these things to complicate it. I can get off Facebook. That is not the question. The question is, how do we create alternate ways of doing these things, the way Twitter played a role in pushing Ferguson in spite of Facebook? So that we have ways of getting around choke points.
But how do we also deal with the fact that when you have no choke points, like in Burma—I don’t know if Facebook’s checking what’s going on; I don’t know how much staff they have. It’s a little country, and they have lots of countries. But you have this sort of ethnic cleansing situation going on. So if they don’t do something to put their thumbs on the scales one way or another (and they are going to do it, one way or another), that’s not an answer, either.
I don’t think we have, you know, “let’s just create decentralized things” as an easy option. But I also don’t think we’re heading towards a very healthy world, because our centralized options are all ad-financed, encouraging virality and clickability…which encourages polarization, false news, outrage, or the feel-good stuff. That’s why you have a lot of babies on your feed, and that’s why here in England you have a lot of Brexit fights on your feed. Both of them attract eyeballs, which is what Facebook is selling.
So I’m going to open it up to questions, because I don’t have a conclusion. I have a lot of questions. But I feel that, given the combination of elite failure and these dynamics shaking multiple countries around the world, how we organize our public sphere to encourage healthier, more inclusive behaviors rather than just outrage or feel-good content is probably one of the most important questions we face right now. So we’ll take questions for the remainder of the time. And I welcome other ways of contacting me; I did put up my email.
Audience 1: There are a lot of big mass-market brand advertisers that don’t just want to reach a nationalist niche or a specific side of an argument. What are big companies that put money into Facebook advertising doing about having their brands associated with this kind of speech?
Zeynep Tufekci: Well, I think they can target many different groups. Obviously, both Facebook and Twitter lean towards the more freewheeling, which means some advertisers don’t like that as much. I think for many of them… I mean, if you’re Coca-Cola, you probably care about it. But if you’re advertising in, say, a country like Turkey or India or Burma, I don’t think it’s an impediment. I don’t think that’s a problem. And also what you can do is just segment your audiences. And we live with advertising. It just seems to be a fact of life, because people are there and there you are.
Audience 2: Hi, I’m interested in whether you think it’s primarily a technology solution or a [tool?] solution.
Tufekci: Okay, so this is a multi-level problem. We absolutely need more resources and alternative technologies, because right now, because of network effects, there are no real market constraints on Facebook. You’re not going to outcompete it by itself. So we should try to make sure there are alternatives that people could use, to put some market pressure on Facebook. So that’s one thing we should do. And it’s not just Facebook. This is true for those few big platforms: for Twitter, for Google and YouTube. So alternatives are important.
I think pushing, sort of complicating… I guess, look: I got on the Internet when it came to Turkey. I was one of the first few users. I loved it. It opened up my world, and I thought, like Mark Zuckerberg says right now, that open and connected is great. It is much more complicated than that, as we’ve come to see. It’s not true that if we open everything up and just make connection possible, only good things are going to happen. So we need to complicate how we think about connection and inclusivity, and bring more normative values besides “including people” into the conversation. Because including, you know, white supremacists so that they can go viral, so that they can go harass black voters, is also a form of inclusion. You have to say inclusion by itself is not the only value. It has to be paired with all the other things we’re balancing, including other people’s right to exist.
So there’s that. And there are also resource problems here. Because if we’re going to address these things— The way we do a lot of things on the Internet is that a merry band of volunteer geeks come together and type up some stuff. And it’s great, and it does some things. But minor things like downtime mean you’re never going to be inclusive enough to be an alternative, which requires resources. So there’s a multi-level answer, I think.
And finally, the final answer is really a new way of grappling with the question, “What does happen if billions of people are connected?” I think we just started asking that question, and some of the answers are not as pretty and sort of rosy as we might’ve thought when this started. And I include myself in the sort of evolving [?].
Audience 3: Hi. I was wondering how you would talk about folding in how the ad structure and some of the features of Facebook allow for a monetarily-based form of suppression. Recently they’ve been reporting on how ads have been specifically avoiding targeting black and poorer voters, as well as there being dark ad campaigns to suppress votes.
Tufekci: So, thank you very much for that question. The other thing this public sphere allows— I wrote about this in 2012, on the Obama campaign, which had used sort of sophisticated voter targeting. And I said, wait: sophisticated voter targeting, in a private place where you don’t see what other people can see, means that a campaign could come and target hate speech to people who are receptive to it, and nobody else sees it. Because if it’s on TV, it’s horrible to have horrible things on TV, but at least it’s there to be publicly challenged and countered.
And so I worried about this kind of silent, private but widespread hate speech, or other sorts of incitement, being used by political campaigns. And I wrote this in the context of [2012], Obama’s election. A lot of my friends were like, “We like Obama.” I said it doesn’t matter. This is not Thor’s hammer, that only the purest of heart can pick up, right? This isn’t going to work that way. And I got a lot of flak from my friends who had worked on Obama’s campaign and who were data scientists. They said, “No no, this will be our tool.”
Well, enter 2016, and ProPublica just published this piece showing that on Facebook you can literally segment so that you exclude African Americans, or any other ethnicity, from your ad targeting. Now, this is what happens when engineers do not talk to other people while saying, “Let’s put more selectors out there. Isn’t this great?” No, it’s not. It’s illegal if it’s housing, and it can do all sorts of other horrible things. If it’s a job ad, can you imagine? Here’s a job ad, but don’t show it to these ethnicities. A housing ad: don’t show it. Or, here’s a “let’s organize voter protection” (which is actually “let’s go harass black people at the polls”), but only target white people.
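To show concretely the kind of selector ProPublica was describing, here is a purely hypothetical targeting spec, a plain dictionary with invented field names, not Facebook’s real Marketing API, along with the sort of compliance check such a system evidently lacked:

```python
# Purely hypothetical ad-targeting spec with invented field names; this is
# NOT Facebook's real Marketing API. It illustrates how one "exclude" field
# can turn a housing ad into unlawful discrimination.

housing_ad = {
    "creative": "Spacious two-bedroom apartment in a great neighborhood!",
    "targeting": {
        "geo": {"country": "US", "city": "St. Louis"},
        "age_range": (25, 55),
        "exclude_affinity_groups": ["African American"],  # the problem selector
    },
}

PROTECTED_AD_CATEGORIES = {"housing", "employment", "credit"}

def violates_exclusion_rules(ad: dict, category: str) -> bool:
    """The simple rule the talk calls for: certain ad categories must not
    allow ethnicity-based exclusion at all."""
    excludes = ad["targeting"].get("exclude_affinity_groups", [])
    return category in PROTECTED_AD_CATEGORIES and bool(excludes)

print(violates_exclusion_rules(housing_ad, "housing"))  # True
```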
All of this had been possible until now. And it would’ve been private. So this is another thing. I had suggested that there be a centralized repository where Facebook lets us know who targeted whom. It is so important. Yes, for commercial reasons they’re not going to tell us exactly that I got this or you got that, but they can do certain things. And I think the fact that such a simple thing, that you do not allow this kind of exclusion for certain kinds of ads, got past them shows that they especially need to do it.
But. Last thought: with enough data on you, I do not need to know your race as a category to deduce your probable race, using proxies. So even if Facebook cleans this up, which they will, because it’s illegal, it is absolutely possible to use computational inference to do something like advertise some snake-oil solution only to “black women who are feeling depressed these days,” without using black or depressed as direct selectors. Because with enough data you can tell these things.
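Her point about proxies is easy to demonstrate. This sketch, on synthetic data with invented feature names, fits a standard classifier on innocuous-looking signals that merely correlate with a protected attribute, and recovers it without ever using it as a selector:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch of proxy-based inference on SYNTHETIC data only. A classifier
# learns to predict a hidden protected attribute from innocuous-looking
# signals that happen to correlate with it; no direct selector is used.

rng = np.random.default_rng(0)
n = 2000
protected = rng.integers(0, 2, size=n)              # hidden attribute (0/1)
zip_signal = protected + rng.normal(0.0, 0.8, n)    # e.g. residential-segregation proxy
likes_signal = protected + rng.normal(0.0, 0.9, n)  # e.g. page-like / interest proxy
X = np.column_stack([zip_signal, likes_signal])

model = LogisticRegression().fit(X, protected)
print(f"protected attribute recovered from proxies alone: {model.score(X, protected):.0%}")
```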
So not only is this new public sphere centralized, with choke points, it’s precise. The past one, television, was a baseball bat kind of banging you on the head, and you went “ow.” It was crude. Right now we have a scalpel that can do exclusion and targeting in all sorts of ways the old one couldn’t. And maybe there are advantages to not having this much precise power in the hands of people with power and money already. So that’s… I mean, it’s a great extra layer of difficult questions to deal with.