Introducer: Our next talk will be tackling how social media companies are creating a global morality standard through content regulations. This will be presented by the two persons standing here, the digital rights advocate Matthew Stender, and writer and activist Jillian C. York. Please give them a warm round of applause.

Matthew Stender: Hello everybody. I hope you had a great Congress. Thank you for being here today. We're almost wrapped up with the Congress, but we appreciate you being here. My name is Matthew Stender. I am a communication strategist, creative director, and digital rights advocate focusing on privacy, social media, censorship, and freedom of the press and expression.

Jillian C. York: And I'm Jillian York and I work at the Electronic Frontier Foundation, where I work on privacy and free expression issues, as well as a few other things. I'm based in Berlin. [applause] Thank you. From Berlin? Awesome. I hope to see some of you there.

So today we're going to be talking about sin in the time of technology, and what we mean by that is the way in which corporations, particularly content platforms and social media platforms, are driving morality and our perception of it. We've got three key takeaways to start off with.

The first is that social media companies have an unparalleled amount of influence over our modern communications. This we know. I think this is probably something everyone in this room can agree on.

These companies also play a huge role in shaping our global outlook on morality and what constitutes it. So the ways in which we perceive different imagery and different speech are being increasingly defined by the regulations that these platforms put upon us in our daily activities on them.

And third, they're entirely undemocratic. They're beholden to shareholders and governments, but not at all to the public; not to me, not to you. Rarely do they listen to us, and when they do, there has to be a fairly exceptional amount of public pressure on them. So that's our starting point. That's what we want to kick off with. I'll pass the mic to Matthew.

One strong hint is buried in the fine print of the closely guarded draft. The provision, an increasingly common feature of trade agreements, is called “Investor-State Dispute Settlement,” or ISDS. The name may sound mild, but don't be fooled. Agreeing to ISDS in this enormous new treaty would tilt the playing field in the United States further in favor of big multinational corporations.
Elizabeth Warren, “The Trans-Pacific Partnership clause everyone should oppose”

Stender: So thinking about these three takeaways, I'm going to bring it kind of top-level for a moment, to introduce an idea today which some people have talked about: the idea of the rise of the techno class. Probably a lot of people in this room have followed the negotiations leaked in part and then in full by WikiLeaks about the Trans-Pacific Partnership, the TPP. What some people have mentioned during this debate is the idea of corporate capture: a world in which corporations are maturing to the extent that they can now sue governments, in which the multinational reach of many corporations is larger than the diplomatic reach of countries. And with social media platforms being part of this, the social media companies are now going to have the capacity to influence not only cultures but people within cultures, and how they communicate with people inside their culture and communicate globally. So as activists and technologists, I would like to propose that we start thinking beyond the product and service offerings of today's social media companies and start looking ahead two, five, ten years down the road, when these companies may have social media service offerings which are indistinguishable from today's ISPs and telcos. And this is really to say that social media is moving past the era of the walled garden into neo-empires.

One of the things on this slide are some headlines about different delivery mechanisms that the social media companies and people like Elon Musk are looking to roll out to almost leapfrog, if not completely leapfrog, the existing technologies of terrestrial broadcasting, fiber optics, these sorts of things. So now we're looking at a world in which Facebook is going to have drones, Google is looking into balloons, and other people are looking into low Earth orbit satellites, to be able to provide directly to the end consumer, to the user, to the handset, the content which flows through these networks.

So one of the first things I believe we're going to see in this field is Free Basics. Facebook has a service (it was launched as Internet.org and has now been rebranded to Free Basics), and why this is interesting is that on one hand Free Basics is a free service that is trying to get people who are not on the Internet now to use Facebook's window to the world. It has maybe a couple dozen sites that are accessible. It runs over countries' data networks. Reliance, a telecommunications company in India, is one of the larger telcos, but not the largest. There's a lot of pressure that Facebook is putting on the government of India right now to be able to have this service offered across the country.

One of the ways this is problematic is that because only a limited number of websites flow through it, for people who get exposed to Free Basics, this might be their first time seeing the Internet in some cases. An example that's interesting to think about is a lion born into a zoo. Perhaps evolution and other things may have this lion dream of running wild on the plains of Africa, but at the same time it will never know that world. Facebook Free Basics users, knowing only Facebook's window on the Internet, may not all jump over to a full data package on their ISP, and many people may be stuck in Facebook's window to the world.

Meanwhile, the postings, pages, likes, and friend requests of millions of politically active users have helped to make Zuckerberg and colleagues very rich. These people are increasingly unhappy about the manner in which Facebookistan is governed and are taking action as the stakes continue to rise on all sides.
Rebecca MacKinnon, “Ruling Facebookistan”

York: In other words, we've reached an era where these companies have, as I've said, unprecedented control over our daily communications. Both the information that we can access, and the speech and imagery that we can express to the world and to each other. So the postings and pages and friend requests of millions of politically active users have helped to make Mark Zuckerberg and his colleagues, as well as the people at Google and Twitter and all these other fine companies, extremely rich. And yet we're pushing back. Here I've got a great quote from Rebecca MacKinnon where she refers to Facebook as Facebookistan, and I think that is an apt example of what we're looking at. These are corporations, but they're not beholden at all to the public, as we know, and instead they've kind of turned into these quasi-dictatorships that dictate precisely how we behave on them.

The U.S. Internet market remains too big to ignore, as a result of which products are typically tailored to suit the speech norms of this market and have been tailored in this manner for almost two decades.
Ben Wagner, “Governing Internet Expression: How Public and Private Regulation Shape Expression Governance,” Journal of Information Technology and Politics, Volume 10, Issue 4, 2013

I also wanted to throw this one up to talk a little bit about the global speech norm. This is from Ben Wagner, who's written a number of pieces on this and who developed the concept of a global speech standard, which is what these companies have begun and are increasingly imposing upon us. This global speech standard is essentially catering to everyone in the world, trying to make every user in every country and every government happy, but as a result it has tamped down free speech to this very basic level that makes the governments of, let's say, the United States and Germany happy, as well as the governments of countries like Saudi Arabia. Therefore we're looking at really the lowest common denominator when it comes to some types of speech, and this sort of flat, grey standard when it comes to others.

Stender: So as Jillian just mentioned, we have countries in play. Facebook and other social media companies are trying to pivot and play on an international field, but let's just take a look for a moment at the scale, scope, and size of these social media companies.

I just pulled some figures from the Internet, with some of the latest census information. We have China, 1.37 billion people. India, 1.25 billion people. 2.2 billion individuals practicing Islam and Christianity. But now we have Facebook with, according to their statistics, 1.5 billion monthly active users. Their statistics; I'm sure many people here would like to dispute these numbers, but at the same time these platforms are now large. I mean, not larger than some of the religions, but Facebook has more monthly active users than China or India has citizens. So we're not talking about basement startups; we're now talking about companies with the size and scale to be really influential in a larger institutional way.

Magna Carta, the US Constitution, the Declaration of Human Rights, the Treaty of Maastricht, the Bible, the Koran. These are time-tested, or at least long-standing, principal documents that place upon their constituents, whether citizens or spiritual adherents, a certain code of conduct. Facebook, as Jillian mentioned, is non-democratic. Facebook's terms and standards were written by a small group of individuals with a few compelling interests in mind, but we are now talking about 1.5 billion people on a monthly basis who are subservient to a terms of service which they had no input on.

So to pivot from there and bring it back to spirituality, why is this important? Well, spiritual morality has always been a place for religion. Religion has a monopoly on the soul, you could say. Religion is a set of rules which, if you obey, allow you to not go to hell, or to have an afterlife, or be reincarnated, whatever the religious practice may be. Civil morality is quite interesting in the sense that the sovereign state, as a top-level institution, has the ability to put into place a series of statutes and regulations, the violation of which can send you to jail. Another interesting note is that the state also has a monopoly on the use of sanctioned violence. That is to say, the official actors of the state are able to do things which the citizens of that state may not. And if we take a look at this concept of digital morality I spoke about earlier, with services like Free Basics introducing new individuals to the Internet, well, by a violation of the terms of service you can be excluded from these massive global networks. And Facebook is actively trying to create, if not a monopoly, a semi-monopoly on global connectivity in a lot of ways.

So what drives Facebook? A few things. One is a protectionistic legal framework. The control of copyright violations is something that a lot of platforms stamped out pretty early. They don't want to be sued by the RIAA or the MPAA, and so there were mechanisms through which copyrighted material could be taken off the platform. They also limit potential competition. And I think this is quite interesting in the sense that they've shown this in two ways. One, they've purchased rivals or potential competitors. You see this with Instagram being bought by Facebook. But Facebook has also demonstrated the ability, or shown the willingness, to censor certain content. tsu.co is a new social site, and mentions of and links to this platform were deleted or not allowed on Facebook. So even using Facebook as a platform to talk about another platform was not allowed. And then a third component is operation on a global scale. It's not only the size of the company but also its global reach. Facebook maintains offices around the world, as other social media companies do. They engage in public diplomacy, and they operate in many countries and many languages.

So just to stay with companies like Facebook for a moment. If we're looking at economics, you have the traditional twentieth-century multinationals, Coca-Cola, McDonald's. The goal for the end user of these products was consumption. This is changing now. Facebook is looking to capture more and more parts of the supply chain, as a service provider, as a content moderator, and as the party responsible for negotiating and adjudicating content disputes. At the end of the day, users are really the product. The platform isn't really for us, the Facebook users; it's really for advertisers. And if we draw a hierarchy of the platform, we have the corporation, then advertisers, and then users kind of at the fringes.

York: So let's get into the nitty gritty a little bit about what content moderation on these platforms actually looks like. I've put up two headlines from Adrian Chen, a journalist who wrote these for Gawker and Wired respectively. They're both a couple years old. But what he did was investigate who was moderating the content on these platforms. And what he found, and accused these companies of, is outsourcing their content moderation to low-paid workers in developing countries. In the first article I think Morocco was the country, and I'm going to show a slide from that in a bit, of what those content moderators worked with. And the second article talked a lot about the use of workers in the Philippines for this purpose. We know that these workers are probably low-paid. We know that they're given a very, very minimal time frame to look at the content that they're being presented.

So here's how it basically works across platforms, with small differences: I post something. (And I'll show you some great examples of things I posted later.) I post something, and if I post it to my friends only, my friends can then report it to the company. If I post it publicly, anybody who can see it, or anybody who's a user of the product, can report it to the company. Once a piece of content is reported, a content moderator then looks at it, and within that very small time frame (we're talking half a second to two seconds, probably, based on the investigative research that's been done by a number of people) they have to decide if this content fits the terms of service or not.
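The reporting flow just described can be sketched as a toy model. This is purely illustrative (the class and function names are invented, and the real pipelines are vastly more complex), but it captures the mechanics: visibility determines who may report, and only reported content ever reaches a human moderator.

```python
from dataclasses import dataclass, field
from enum import Enum

class Visibility(Enum):
    FRIENDS = "friends"
    PUBLIC = "public"

class Decision(Enum):
    KEEP = "keep"
    REMOVE = "remove"

@dataclass
class Post:
    author: str
    body: str
    visibility: Visibility
    reports: list = field(default_factory=list)

def can_report(post: Post, viewer: str, friends_of_author: set) -> bool:
    # Friends-only posts can be reported by the author's friends;
    # public posts can be reported by any user of the platform.
    if post.visibility is Visibility.FRIENDS:
        return viewer in friends_of_author
    return True

def report(post: Post, viewer: str, friends_of_author: set) -> bool:
    if can_report(post, viewer, friends_of_author):
        post.reports.append(viewer)
        return True
    return False

def moderate(post: Post, violates_terms) -> Decision:
    # Only reported content reaches a moderator, who must decide in
    # seconds whether the post fits the terms of service.
    if not post.reports:
        return Decision.KEEP
    return Decision.REMOVE if violates_terms(post) else Decision.KEEP
```

Note that in this model, as on the real platforms, unreported content is never reviewed at all: moderation is reactive, driven entirely by user flags.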

Now, most of these companies have a legalistic terms of service as well as a set of community guidelines or community standards which are meant to be clearer to the user, but they're still often very vague, and so I want to get into a couple of examples that show that.

An extensive list of guidelines for evaluating content for abuse moderation

Image: Gawker

This slide is one of the examples that I mentioned. You can't see it very well, so I won't leave it up for too long, but that is what content moderators at this outsourcing company, oDesk, were allegedly using to moderate content on Facebook.

This next photo contains nudity.

Image: Paper

I think everyone probably knows who this is and has seen this photo. Yes? No? OK. This is Kim Kardashian, and this photo allegedly “broke the Internet.” It was a photo taken for Paper Magazine. It was posted widely on the Web, and it was seen by many, many people. Now, this photograph definitely violates Facebook's Terms of Service, buuut Kim Kardashian's really famous and makes a lot of money, so in most instances, as far as I could tell, this photo was totally fine on Facebook.

We restrict the dis­play of nudity because some audiences within our global community may be sensitive to this type of content — particularly because of their cultural background or age.
Facebook Community Standards, accessed January 13, 2016 (bolding added)

Now let's talk about those rules a little bit. Facebook says that they restrict nudity unless it is art. So they do make an exception for art, which may be why they allowed that image of Kim Kardashian's behind to stay up. But art is defined by the individual. And yet at the same time they make clear that, let's say, a photograph of Michelangelo's David or a photograph of another piece of art in a museum would be perfectly acceptable, whereas your average nudity probably is not going to be allowed to remain on the platform.

They also note that they restrict the display of nudity because their global community may be sensitive to this type of content, particularly because of their cultural background or age. So this is Facebook, in their community standards, telling you explicitly that they are toning down free speech to make everyone happy.

This is another photograph. Germans particularly, I'm interested to see: is everyone familiar with the show The Golden Girls? Quite a few. So you might recognize her. She played Dorothy in The Golden Girls. This is the actress Bea Arthur, and this is from a 1991 painting of her by John Currin. It's unclear whether or not she sat for the painting. It's a wonderful image. It's a very beautiful portrait of her. But I posted it on Facebook several times in a week, I encouraged my friends to report it, and in fact Facebook found this to not be art. Sorry.

Another image. This is by a Canadian artist named Rupi Kaur. She posted a series of images in which she was menstruating. She was trying essentially to convey the normality of this, the fact that this is something that all women go through. Or most women go through, rather. And as a result, Instagram: denied. Unclear on the reasons. They told her that it had violated the terms of service, but weren't exactly clear as to why.

“sexcutrix as La Fornarina, Raphael,” from Webcam Venus, by Addie Wagenknecht & Pablo Garcia

And finally, this is another one. This was by an artist friend of mine. I'm afraid that I have completely blanked on who did this particular piece, but what they did was take famous works of nude art and have sex workers pose in the same poses as the pieces of art. I thought it was a really cool project. But Google+ did not find it to be a really cool project, and because of their guidelines on nudity they banned it.

This is a cat. Just wanted to make sure you're awake. It was totally allowed.

“Diversity” at social media companies

Facebook: 68% men globally. In the US, 55% white.
Twitter: 66% men globally. In the US, 59% white.
Google: 70% men globally. In the US, 60% white.
[presentation slide]

In addition to the problems of content moderation, I'm going to say that we also have a major diversity problem at these companies. These statistics are facts. These are from the companies themselves. They all put out diversity reports recently, or as I like to call them, “diversity” reports. The statistics are structured a little bit differently because the companies only capture data on ethnicity or nationality in their US offices, since those standards vary all over the world. So the first stat on each line refers to their global staff; the second refers to their US staff. But as you can see, these companies are largely made up of white men, which is probably not surprising, but it is a problem.

Now why is it a problem, particularly when you're talking about policy teams? The people who build policies and regulations have an inherent bias. We all have an inherent bias. But what we've seen here is really a bias toward a sort of American style of prudishness. Nudity is not allowed, but extreme violence, as long as it's fictional, is totally OK. And that's generally how these platforms operate. And so I think that when we ensure that there is diversity in the teams creating our tools, our technology, and our policies, then we can ensure that diverse worldviews are brought into that creation process, and that the policies are therefore more just.

So what can we do about this problem as consumers, as technologists, as activists, as whomever you might identify as?

The first one I think a lot of the technologists are going to agree with: develop decentralized networks. We need to work toward that ideal because these companies are not getting any smaller. I'm not necessarily going to go out and say that they're too big to fail, but they are massive, and as Matthew noted earlier they're buying up properties all over the place and making sure that they do have control over our speech.

The second thing is to push for greater transparency around terms of service takedowns. Now, I'm not a huge fan of transparency for the sake of transparency. These companies have been putting out transparency reports for a long time that show which countries ask them to take down content or hand over user data. But we've seen those “transparency” reports to be incredibly flawed already. So pushing for greater transparency around terms of service takedowns is only a first step.

The third thing is we need to demand that these companies adhere to global speech standards. We already have the Universal Declaration of Human Rights. I don't understand why we need companies to develop their own bespoke rules. [applause] And so by demanding that companies adhere to global speech standards, we can ensure that these are places of free expression. Because it is unrealistic to just tell people to get off Facebook. I can't tell you how many times in the tech community over the years I've heard people say, “Well, if you don't like it, just leave.” That's not a realistic option for many people around the world, and I think we all know that deep down.

And the other thing I would say there, though, is that public pressure works. We saw last year with Facebook's real name policy that there were a number of drag performers in the San Francisco Bay Area who were kicked off the platform because they were using their performance names, their drag names. Which is a completely legitimate thing to do, just as folks have hacker names or other pseudonyms. But those folks pushed back, they formed a coalition, and they got Facebook to change a little bit. It's not completely there yet, but they're making progress and I'm hoping that this goes well.

And the last thing (and this is totally a pitch; throwing that right out there): support projects like ours, which I'm going to throw to Matthew to talk about, onlinecensorship.org, and another project done by the excellent Rebecca MacKinnon called Ranking Digital Rights.

Stender: So just a little bit of thinking outside the box. onlinecensorship.org is a platform that recently launched. Users can go onto the platform and, through a short questionnaire, report if their content has been taken down by the platforms. Why we think this is exciting is that right now, as Jillian mentioned, transparency reports are fundamentally flawed, so we are looking to crowdsource information about the ways in which six social media companies are moderating and taking down content. Because we can't know their accountability and transparency exactly in real time, we're hoping to find trends within this self-reported content takedown data: trends across the kinds of content being taken down, geographic trends, news-related trends. It's platforms like these, I think, that will hopefully begin to spring up in response, for the community to put tools in place so that people can be a part of the reporting and transparency initiative.
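As a rough sketch of what crowdsourcing this data might look like: each questionnaire submission could be stored as a small record and then aggregated along the axes just mentioned (platform, geography, kind of content). The field names here are hypothetical illustrations, not the actual onlinecensorship.org schema.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class TakedownReport:
    platform: str      # e.g. "Facebook", one of the six tracked companies
    country: str       # reporter's country, for geographic trends
    content_type: str  # e.g. "photo", "page", "account"
    reason_given: str  # whatever explanation (if any) the platform gave

def trends(reports):
    # Aggregate self-reported takedowns along the axes the talk mentions:
    # by platform, by geography, and by kind of content.
    return {
        "by_platform": Counter(r.platform for r in reports),
        "by_country": Counter(r.country for r in reports),
        "by_content_type": Counter(r.content_type for r in reports),
    }
```

Because the data is self-reported, such counts reveal trends rather than true takedown totals, which is exactly the caveat about real-time accountability above.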

York: We launched about a month ago, and we're hoping to put out our first set of reports around March.

It is reasonable that we press Facebook on these questions of public responsibility, while also acknowledging that Facebook cannot be all things to all people. We can demand that their design decisions and user policies be explicit, thoughtful, and open to public deliberation. But the choices Facebook makes, in their design and in their policies, are value judgments.
Jessa Lingel & Tarleton Gillespie, “One Name to Rule Them All: Facebook's Identity Problem”

And finally, I just want to close with one more quote before we slip into Q&A, and that is just to say that it's reasonable that we press Facebook on these questions of public responsibility while also acknowledging that Facebook cannot be all things to all people. We can demand that their design decisions and user policies be explicit, thoughtful, and open to public deliberation. But, and this is the most important part in my view, the choices that Facebook makes in their design and in their policies are value judgments. This is political, and I know you've heard that in a lot of talks. So have I. But I think we cannot forget that this is all political, and we have to address it as such. And for some, if that means quitting the platform, that's fine too. But I think we should still understand that our friends, our relatives, our families are using these platforms, and that we owe it to everybody to make them a better place for free expression and privacy.

Thank you.


Discussion

Audience 1: You just addressed it. I'm sort of, especially after listening to your talk, I'm sort of on the verge of quitting Facebook, or starting to. I don't know. And I agree it's a hard decision. I've been on Facebook for I think six years now, and it is a dispute for me, myself. So I'm in this very strange position and now I have to decide what to do. Is there any help for me out there that takes my state and helps me decide? Or, I don't know. It's strange.

York: That's such a hard question. I'll put on my privacy hat for just a second and say what I would say to people when they're making that consideration from a privacy viewpoint, because I do think that the implications of privacy on these platforms are often much more severe than those of speech. But this is what I do. So in that case I think it's really about understanding your threat model. It's understanding what sort of threat you're under when it comes to the data collection that these companies are undertaking, as well as the censorship, of course. But I think it really is a personal decision, and I'm sure that there are great resources out there around digital security and around thinking through those threat model processes, and perhaps that could be of help to you. I don't know if you want to add…

Stender: I think it's one of these big toss-ups. This is a system which many people are connected through; even email addresses sometimes roll over. So I think it's the opportunity cost: by leaving a platform, what do you have to lose, what do you have to gain? But it's also important to remember that the snapshot of Facebook we see now is probably not going to get better. It's probably going to get more invasive, coming into different parts of our lives. So I think from the security and privacy aspect, it's really just up to the individual.

Audience 1: A short follow-up, if I'm allowed. The main point for me is not my personal implications. I am quite aware that Facebook is a bad thing, and I can leave it. But I'm thinking we're way past the point where we can decide on our own, "Okay, is it good for me, or is it good for my friend, or is it good for my mom and my dad?" We have to think about whether Facebook as such is a good thing for society, as you're addressing. So I think we have to drive this decision-making from one person to a whole lot of persons.

York: I agree. And I'll note that what we're talking about, and the project that we're working on together, is a small piece of the broader issue, and I agree that this needs to be tackled from many angles.

Audience 2: One of the questions from the Internet is: aren't the moderators the real problem, the ones who ban everything they don't really like, rather than the providers of the service?

York: I would say that the content moderators… we don't know who they are. So that's part of the issue. We don't know, and I've heard many allegations over the years when certain content has been taken down in a certain local or cultural context. Particularly in the Arab world, I've heard the accusation that, "Oh, those content moderators are pro-Sisi" (the dictator in Egypt) or whatever. I'm not sure how much merit that holds, because like I said, we don't know who they are. But what I would say is that it doesn't feel like they're given the resources to do their jobs well. So even if they were the best, most neutral people on earth, they're given very little time, probably very little money, and not a whole lot of resources to work with in making those determinations.

Audience 3: First off, thank you so much for the talk. I just have a basic question. It seems logical that Facebook is trying to put out this mantra of "protect the children." I can kind of get behind that. And it also seems, based on the fact that they have the "real names policy," that they would also expect you to put in a real legal age. So if they're trying to censor things like nudity, why couldn't they simply use something like age as a criterion to protect children from nudity, while letting everyone else who is above the legal age make their own decision?

York: Do you want to take that?

Stender: I think it's a few factors. One is on the technical side: what constitutes nudity? And in a process way, if content does get flagged, do you have some boxes to say what sort of content it is? I could see a system in which content flagged as nudity gets referred to a special nudity moderator, and then if the moderator says yes, this is nudity, it gets filtered for anyone under the legal age, or whatever age. But I think it's part of a broader, more systematic approach by Facebook. It's the broad strokes. It's really about dictating this digital morality baseline, and saying, "No. Nobody in the world can see this. These are our hard lines, and it doesn't matter what age you are or where you reside. This is the box which we are placing you in, and content that falls outside of this box, for anybody regardless of age or origin, you cannot see, and if you post anything that falls outside of it, you risk having your account suspended." So I think it's a mechanism of control.
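The contrast between the questioner's age-gated model and the single global baseline Matthew describes can be made concrete with a small sketch. This is purely hypothetical: the flag names and the age threshold are assumptions, and neither function reflects any platform's actual implementation.

```python
def visible_to(user_age: int, content_flags: list, legal_age: int = 18) -> bool:
    # Age-gated model: content a moderator confirmed as nudity is hidden
    # only from users below the legal age, not removed for everyone.
    if "nudity" in content_flags and user_age < legal_age:
        return False
    return True

def visible_globally(content_flags: list, banned=frozenset({"nudity"})) -> bool:
    # One-size-fits-all baseline: the same hard lines apply to every
    # user, regardless of age or origin.
    return not (set(content_flags) & banned)
```

Under the first model an adult and a teenager see different things; under the second, nobody sees the flagged content at all, which is the "mechanism of control" point.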

Further Reference

This presentation page at the CCC media site, with downloads available.

