Jillian C. York: Thank you every­one. And I know that every­one here is inside instead of out­side where it’s still beau­ti­ful. And I know that I’m stand­ing between you and drinks. So I’m going to keep this talk fair­ly short.

But I want to preface this by saying that I don't have all the solutions to this problem. I don't think that anyone does. And right now in the US, where I'm from, I heard the news just a couple of hours ago that a young boy, an 8-year-old boy, was lynched in my home state. He survived. He's in a hospital. But to me, what's happening in my own country right now is really terrifying and that's why I'm seeking out solutions.

That said I also have to intro­duce this by say­ing that I’ve been work­ing for the past six or sev­en years as a free speech advo­cate. And free speech is a real­ly dif­fi­cult top­ic in the US right now, because a lot of peo­ple believe that free speech means that all speech is equal. I don’t believe that. However, I remain against cen­sor­ship and I’m going to tell you why.

So first of all what is hate speech? I think that there’s prob­a­bly a num­ber of dif­fer­ent peo­ple in this room from dif­fer­ent coun­tries, and we all have dif­fer­ent ideas about what this means. 

So, here's the United Nations definition: "The advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence." That's probably one of the better definitions that I've seen of this topic.

…content that directly attacks people based on their race; ethnicity; national origin; religious affiliation; sexual orientation; sex, gender, or gender identity; or serious disabilities or diseases
Facebook Community Standards, Hate Speech sec­tion [pre­sen­ta­tion slide]

And then we have Facebook's definition, which is a little bit more vague: content that directly attacks people based on a number of different characteristics, as you can see. And this is how Facebook adjudicates hate speech.

incite­ment to hatred against seg­ments of the pop­u­la­tion and refers to calls for vio­lent or arbi­trary mea­sures against them, includ­ing assaults against the human dig­ni­ty of oth­ers by insult­ing, mali­cious­ly malign­ing, or defam­ing seg­ments of the population.
[pre­sen­ta­tion slide]

We have the German law—I’m liv­ing in Germany right now, which is why I chose this one. Incitement to hatred against seg­ments of the pop­u­la­tion, spe­cif­ic calls for vio­lent or arbi­trary mea­sures against them, includ­ing assaults against human dig­ni­ty, by insult­ing, mali­cious­ly malign­ing, or defam­ing seg­ments of the pop­u­la­tion. I like that def­i­n­i­tion. It’s a lit­tle bit more spe­cif­ic and it deals with the top­ic of incite­ment, which I’m about to talk about.

1989 Prohibition of Incitement to Hatred Act bans "written material, words, behaviour, visual images or sounds [that], as the case may be, are threatening, abusive or insulting and are intended or, having regard to all the circumstances, are likely to stir up hatred."
[pre­sen­ta­tion slide]

This one's a little bit out of date. I was in Dublin last week and I also gave this talk there. This one looks at Ireland's hate speech law, which I thought was really interesting (the bold in the text is my own) in that they used the phrase "to stir up hatred," as though that's a common understanding that everyone has. And I found that rather interesting. The reason that I put these different laws and different rules on the screen is to demonstrate how much the definitions of hate speech vary from place to place, company to company, government to government. Basically what I mean is that no two people seem to have the same definition of what hate speech means.

And here’s the United States, of course, [screen not vis­i­ble] where we don’t have hate speech as a con­cept, legal­ly. In the US, the First Amendment pro­tects all speech, includ­ing hate­ful speech but not includ­ing incite­ment. And that’s why I want to make this line between hate speech and incitement. 

So, what is the harm of hate speech? I think that this is clear to a lot of people. I'm sure that many of you have experienced hate speech in one form or another, whether it's against you or against someone you know, someone you care about, or just a group in your own country.

Illustration of a lit match, with "are all muslims terrorists?" along the stick [presentation slide]

So this particular slide referred to a specific study that was about the correlation between hateful speech, or inciting speech, against Muslims and hate crimes. That is to say violent crimes against people specifically. This was a US-based study. And what it did was it looked at search terms related to Islam and related to Muslims, such as "kill Muslims" and "are any Muslims not terrorists." I don't have the speaker notes, but these are the type of searches that these researchers looked at. And what they found is a direct geographic correlation between those kinds of search terms and hate crimes, based on a period of I think about eleven years across a number of different geographies. And that correlation was very strong. And so we know that hate speech has a direct impact on individuals, that it can in fact have a violent impact on individuals, particularly when that speech is inciting violence specifically.

So, the Dangerous Speech Project is a real­ly inter­est­ing project that I sug­gest every­one look at, which is why I’ve thrown the URL up there. They seek to make a dis­tinc­tion between the con­cept of hate speech, which can be a num­ber of things as we’ve noted—things that are just hurt­ful, things that hurt feel­ings, things that hurt peo­ple psy­cho­log­i­cal­ly. And I don’t want to dimin­ish that. But what they seek to do is draw the dif­fer­ence between things that are mere­ly harm­ing psy­cho­log­i­cal­ly and things that actu­al­ly have a direct cor­re­la­tion to violence.

So dan­ger­ous speech, as opposed to hate speech, is defined basi­cal­ly as speech that seeks to incite vio­lence against peo­ple. And that’s the kind of speech that I’m real­ly con­cerned about right now. That’s what we’re see­ing on the rise in the United States, in Europe, and elsewhere.

And so, what's happening as a result of this is that governments and companies together are seeking ways to sweep that speech under the rug and hide it from our eyes. Now what does this mean? On the one hand you have governments like Germany's government which has had these laws in place for a long time, which seeks to censor that speech based on longstanding laws. And Germany is basically telling social media companies, as this headline says, "Delete speech or pay financially."

Now, com­pa­nies like Facebook and Google and Twitter have been resis­tant to these kinds of laws, but at the same time they also— (Just anoth­er one; the EU is also doing the same thing.) At the same time, these com­pa­nies have also always had their own rules in place that pre­vent hate­ful speech. And this is the point that I want to get to, that cen­sor­ship also has harms.

Now again, I am against cen­sor­ship but I am not for all speech. And I think that that’s a care­ful dis­tinc­tion that I need to make at this point in time. Because speech can harm, but cen­sor­ship can harm, too. 

A few years ago, I cofounded a project called Onlinecensorship.org. And what we do is look specifically at the harms of censorship on marginalized populations. We started by encouraging people to submit reports to us when they'd experienced unjust or unjustified censorship. Of course we do receive other kinds of reports, too, from people who are engaging in incitement and violent kinds of racist speech. And a lot of the reports we receive are from people who, yes, are being censored under other kinds of rules that are not related to hate speech. But we also get reports from people who are censored by companies based on speech that companies deem as hate speech but that actually is not. And here's a couple of examples of that.

So this is one that came up a few weeks ago. Basically Facebook— I'm not sure how many people here are native English speakers so I'm going to explain this one. The term "dyke" is a term that has been used to harm, to hurt, and to malign people. It's used against lesbians. But at the same time, lesbians have also chosen—historically, for a long time now—to reclaim that word. And so they say, "Okay, we can say dyke but you can't use it against us." And that's fair. We have a lot of terms like this in the US and I'm sure that in other languages this kind of thing happens as well.

And so what was happening on Facebook is that they were basically systematically censoring this term regardless of the context in which it was used. And what that meant was that people's expression was limited. People who wanted to use this term in a reclaimed way, in an empowered way, such as this particular comment where she was saying that "People need to quit rewriting history. Dykes do things. #visibilitymatters," this is a positive statement. But she was nevertheless temporarily kicked off the platform for using just that word. We don't know if this was algorithmic or human content moderation.

Another exam­ple of this has come up a few times in dif­fer­ent con­texts. But basi­cal­ly this head­line is from a writer called Ijeoma Oluo. And she wrote about a par­tic­u­lar expe­ri­ence where she had received a lot of racist harass­ment in a restau­rant. And she wrote about it on Facebook, and Facebook cen­sored her post because it con­tained cer­tain terms. Because the way that she used words—somebody had called her the n‑word—she’d used that word, and Facebook saw that word and cen­sored it.

And the reason that this happens is something that I've been studying for a long time. Basically when you're on a social media platform, you're encouraged to snitch or to report on your friends and other users. You've maybe seen this before. Facebook says, "Okay, you don't like this comment? Report it." And you can say okay, this is hate speech, or I don't like this comment, or whatever.

But what happens is that that post or that content then goes into a queue, where either a human content moderator or an algorithm determines whether it's acceptable or not acceptable speech, and then it's taken down and perhaps there's also some kind of punitive measure, some punishment, meted out to that individual. And in this case she was temporarily suspended from the platform and her posts were deleted.

This is not the only case like this. In fact it’s not the only famous case like this. And in fact it’s not the only famous case like this that hap­pened in the same week. The Washington Post and The New York Times both report­ed on dif­fer­ent cir­cum­stances that had hap­pened that same week, all involv­ing black mothers.

And here’s anoth­er one. And this has been hap­pen­ing on Twitter pret­ty con­sis­tent­ly recent­ly. This par­tic­u­lar case, @meakoopa was a—is; they got their account back, luckily—an LGBTQ aca­d­e­m­ic who had been tweet­ing about antifa, tweet­ing against white suprema­cy, and possibly—I don’t remem­ber this par­tic­u­lar case—but I’m going to gen­er­al­ize this because there were a num­ber of cas­es that week. There were a lot of peo­ple tweet­ing about punch­ing Nazis. And those peo­ple were all sys­tem­at­i­cal­ly cen­sored from Twitter. Because that was deemed to be hate speech. When you want to stand up against a vio­lent fas­cist ide­ol­o­gy, appar­ent­ly accord­ing to this com­pa­ny you too are vio­lent and you too must be censored. 

And so this is what scares me about cen­sor­ship, is the fact that it has nev­er in the his­to­ry of our lives been applied even­ly, in a way that does not have a neg­a­tive trickle-down effect on oth­er indi­vid­u­als and par­tic­u­lar­ly on mar­gin­al­ized groups.

And then of course, we have in Germany just a couple weeks ago for the first time—at least according to The New York Times. I've heard that this is now being contested. But Germany used their anti-hate speech laws to shut down a "left-wing extremist" website.

And so we see that once these rules are put in place they can be used against any­one, not just against the peo­ple whose ide­ol­o­gy we vehe­ment­ly oppose, but also against the rest of us. And I’ve been cen­sored by these plat­forms, too. Not par­tic­u­lar­ly in this case, but nev­er­the­less it hap­pens to so many people.

And then of course I want to bring this up, too, because I think that this is a real­ly inter­est­ing per­spec­tive that I’ve only been recent­ly intro­duced to over the past cou­ple of years. There’s a researcher called Sarah T. Roberts; she spoke at re:publica in Berlin this year. And her work has been look­ing at the labor of con­tent mod­er­a­tion. She’s par­tic­u­lar­ly inter­est­ed in this as a labor prac­tice and the way the con­tent mod­er­a­tors are affect­ed by the kind of mate­r­i­al that they have to look at.

So, what I mean is that you’ve got peo­ple, indi­vid­u­als, humans, who are work­ing in out­sourced com­pa­nies in places like India and the Philippines and a num­ber of dif­fer­ent coun­tries over the world. They’re also in Germany and oth­er Western democ­ra­cies. And these are the peo­ple who are hired to look at the things that you don’t want to see. So when we say we don’t want to see hate speech on these plat­forms, we for­get about the invis­i­ble labor that goes on behind that. The peo­ple who have to look at every behead­ing post, every child sex­u­al abuse image, every porno­graph­ic image, every bit of white suprema­cist con­tent. Just because we don’t see it does­n’t mean that there aren’t a bunch of labor­ers work­ing qui­et­ly, silent­ly, look­ing at this con­tent all day long and mak­ing deter­mi­na­tions about it.

This was just an article that I found really interesting. But it points to one of the problems with these companies, which is that they're undemocratic. These are companies—you know, they're corporations—where individuals are making decisions at the highest level without any input from the public and deciding what is offensive and what's not offensive.

And so in this particular case I'm not going to explain "camel toe." If you don't know it, please just Google it. Don't make me say this on camera. But nevertheless, what this article is particularly about is that Facebook's implementation guidelines were leaked a few years ago. And they've actually been leaked more recently, too, if you saw The Guardian's Facebook Files. And what those leaked guidelines showed was that "camel toes" (and again, Google it please) are more offensive according to those guidelines than "crushed heads."

We've seen other cases— ProPublica, a wonderful nonprofit media organization that reports on this, also found that "white men" are a protected category on Facebook, more so than black children. I'm sorry that I don't have that particular slide in here but I recommend looking for that article. It was really interesting because it contained different slides that demonstrated how Facebook particularly puts together who is protected and who is not protected. And in that case, because one group is… It's complicated. Again, I encourage you to read the article. But in that case they found basically that the implementation guidelines for these workers were to protect white men over other groups.

And so if we agree… And you don't have to agree. I understand that some people are in favor of censorship, and to those people I would say: please, I implore you to at the very least consider the fact that there is no due process and there is no transparency in these corporations. And please, if you are in fact in favor of censoring hate speech, at the very least ensure that you are fighting for due process, for the sake of all of these individuals who are censored and shouldn't be.

But, if like me you don’t think that cen­sor­ship can be imple­ment­ed fair­ly, then we have to start think­ing about what else we can do about hate­ful speech. And this is where I fear that I don’t have all the answers. And I think that this is some­thing that I would love to con­tin­ue the con­ver­sa­tion with all of you over the next cou­ple of days.

So what can we do about hateful speech? I think the first thing is counterspeech. And I know that that is not a good enough solution for a lot of people. I've said this many times but a lot of people push back and say, "Yes, but counterspeech, not everyone can do that." It takes privilege to be able to stand up in the face of a white supremacist and say this or that. And so we need to build communities together, to protect each other and to fight back against hate speech.

And the first thing there in build­ing com­mu­ni­ty is that we need to think about love. And I know that that sounds crazy, and I know that it sounds insuf­fi­cient. But at the same time you know, I gave this the title for a rea­son, Loving Out Loud in a Time of Hate Speech.” I spend a lot of my day on the Internet, and I spend a lot of time on Twitter, and I’ve seen my own heart rate rise. I’ve seen hate brew­ing up inside of myself. And I’ve been angry, so angry, over the past year—particularly since November when my coun­try elect­ed an idiot. Not just an idiot but prob­a­bly a fascist. 

And so, I think that this is where we have to con­sid­er love. We have to think about what’s hap­pen­ing inside our own heads—the anger that we’re all build­ing toward. We have to think about our com­mu­ni­ties and how we can band togeth­er to fight back against hate speech, and to edu­cate the rest of the peo­ple around us, and par­tic­u­lar­ly the peo­ple younger than us, about what this means. And I don’t just mean edu­cate peo­ple to speak back but edu­cate peo­ple about the his­to­ries that led to this point.

Like I said at the very beginning of my talk, a young boy in my home state was lynched this week. Now, I'm from New Hampshire in the United States. Many of you probably don't know it. That's okay, it's not very notable for a number of reasons. But it's 97% white, and I'm not surprised that these young boys didn't know the history that came before them, because we weren't taught it in school. I barely learned about the Holocaust, let alone the history of white supremacy in my own country. And so I encourage you, if you're from a place like I'm from… I know many of you are probably from cities and for that maybe you're lucky. But nevertheless it's important to educate the people around us.

Two women marching, holding a sign reading "love trumps hate" [presentation slide]

And again, I hate that word in the mid­dle. It’s hard for me to even say it. But love does stand up against hate and we have to remem­ber that in every sin­gle one of our interactions. 

And so, I know that that's not the entire solution to hate speech. I know that there's a lot more that we need to talk about. But I want you to consider the effect that censorship has when we apply it to hate speech, and what it means when we allow these undemocratic companies to make these decisions for us without our input. And so keep calm, and love not hate. Thank you very much, and I'm happy to talk about this more with any of you over the next couple of days. Thanks.
