Lindsay Blackwell: Hi everybody, my name is Lindsay Blackwell. I’m a PhD candidate at the University of Michigan Social Media Lab, which is in the School of Information. And I am here today to talk to you about HeartMob.

HeartMob is a private online community designed to provide support for people experiencing online harassment. When I talk about online harassment I’m referring to a very broad spectrum of abusive behaviors that are enabled by technology platforms and used to target a user or a group of users. So this can be anything from flaming, or the use of personal insults or inflammatory language, to things like doxing, or revealing or broadcasting personal information about someone such as a phone number or address, to things like stalking and impersonation and behaviors of that nature.

So just to keep that in mind, this is not an uncommon experience. We know from Pew that 41% of adult American Internet users have personally experienced online harassment. We also know that nearly two thirds of American adults have witnessed online harassment in their feed. So this is something that’s happening a lot.

Women, people of color, LGBT people, and young people are significantly more likely to experience online harassment than their counterparts.

And online harassment has serious impacts on both people’s online and offline lives. So this is not just an online problem. People do report changing their online privacy behaviors, perhaps leaving social media sites altogether, when they have an experience with harassment. But they also report disruptions of their offline lives, emotional and physical distress, increased privacy concerns, and distractions from personal lives and work obligations due to the time and the emotional and physical labor required to manage harassment when you’re experiencing it. So if you’re being harassed by hundreds of people and you want the platform where that’s happening to intervene, it’s your responsibility to go through and report all of those comments, which takes a lot of time and a lot of effort.

No one knows that better than the leaders of Hollaback. Hollaback is an advocacy organization dedicated to ending harassment in public spaces, namely street harassment. And because of the nature of the work that the leaders of this organization do, they’ve suffered consistent and severe online harassment. So, they took their collective experience in intersectional feminist practice, bystander intervention, and social movement-building, and they applied that expertise to creating an online space, an online community, for people who are also experiencing online harassment. And they did that with the help of a large group of people who were frequent targets of online harassment themselves. So they really tried to design a community that would serve the needs of the people who are most vulnerable to these types of behaviors.

And what resulted is a system that looks like this. So if you go to HeartMob, which is iheartmob.org, as someone who’s experiencing harassment you’re told that you’re not alone and you have a few options. So you can tell your story. You can write a description of what’s happened to you. You can select allies; so you can choose people that you know, who you want help from. Or you can rely on the broader HeartMob community.

And then you can ask for specific types of support. So you can ask for supportive messages if that’s what you’d like. You can ask for help reporting and documenting abuse to platforms, so that that labor is more dispersed. You can also ask for resources and instrumental support.

And if you have not experienced harassment, or you just want to be helpful to others who have, you can sign up to join HeartMob as a HeartMobber. And if you choose to join the community you can answer those requests for support. On the homepage, under a message that says these people could use your help right now, you’re shown actual cases that have been posted recently by people who are experiencing harassment, along with the kind of help that they’ve asked for.

So with the help of my collaborators, we conducted semi-structured interviews with eighteen users of HeartMob. Eleven of those users had experienced harassment themselves, although not all eleven had actually posted a case to HeartMob. Seven of our participants were simply bystanders; they just wanted to be there to help. All of our participants lived in the US or in Western Europe, and our interviews were conducted in English.

And we had three major findings from this research. The first is that labeling experiences as online harassment is hugely validating for targets. So as an example, when a user submits their case on HeartMob on that slide that I showed you, they sort of check a few boxes. So they say what platform did this happen on? What do you think motivated this attack? Was it racist harassment, was it misogynist harassment? And then a human moderator on HeartMob actually goes through and reviews every single case.

And getting that signal that said your case was approved and we’re recognizing this as online harassment was very powerful for people. One participant said

It’s the safety net. Right now, the worst that can happen is someone experiences harassment and they have nowhere to go—that’s normal in online communities. But with HeartMob, if someone says they’re experiencing harassment, then at least they get heard… At least they have an opportunity to have other people sympathize with them.
[slide]

We also heard from many participants that this experience on HeartMob of having their harassment experience validated was much different than the experience they had on major social media sites like Facebook and Twitter, which, because of the number of users that they have and the scale of their moderation practices, often rely on canned responses that don’t recognize individual experiences or the impacts of harassment. One participant said

What I think was really frustrating was the level of what people could say and not be considered a violation of Twitter or Facebook policies. If they’re just like, “You should shut up and keep your legs together, whore,” that’s not a violation because they’re not actually threatening me. [Blackwell: That’s true.] It’s complicated and frustrating, and it makes me not interested in using those platforms.
[slide]

Finally, a participant said “It doesn’t have the capacity to singlehandedly solve the problem, but HeartMob makes being online bearable.”

Our second major result was that for bystanders, labeling behaviors as “online harassment” enabled people to really grasp the full scope of this problem. So one participant who actually works with domestic violence victims said that in the work that she does, she helps people understand how society plays a role in our violent culture, but she still didn’t really have a good grasp on how prevalent this problem was online. She said

I’ve never had too much opportunity to actually see the evidence on the Internet. I knew it was there. I talk about it, present about it, but actually seeing the horrific things that people are seeing and doing to others online really brought that to a whole different place for me.
[slide]

Another participant felt that because as a HeartMobber you’re shown individual cases and very concrete, specific ways that you can provide help, this made online harassment intervention feel like a much more manageable problem. “HeartMob is a brilliant way of addressing a problem that I think immobilizes most people, because it seems so big and daunting—so they don’t do anything at all.”

And finally we found that in online spaces, visibly labeling harassment as inappropriate and unacceptable is critical for establishing community norms about what types of behavior are appropriate. One participant said that the experience of online harassment is

something that’s very isolating, because it can make you feel—especially if there’s multiple people doing the harassing—like everyone would be against you…like they’re representing society.
[slide]

So when someone’s experiencing this type of massive attack, they may be getting private indications of support from friends, but overwhelmingly what they’re seeing is harassment. And that makes it feel like that’s the norm.

Another person said that if they go online to support someone, and they look at the person’s other messages and see other people providing visible support, they might think, “Oh that’s neat. There are other people out there intervening as well.” And we found that when people did see those visible signals of intervention, it started to shift the norm of what was considered appropriate.

Ultimately, as you can see in our results, major social media platforms aren’t implementing systems that address the needs of the most vulnerable users online, the people who are experiencing online harassment in the highest volumes. Our research suggests the need for more democratic, user-driven processes in the generation of the values that underpin these systems. Addressing online harassment is best served by putting the needs of vulnerable users first.

If you’d like to read the full paper it’s available at lindsayblackwell.net/heartmob. Thank you.