Nathan Matias: So, I’ll ask just a couple of questions and we’ll open it up just for a couple of minutes. So one of the things I found interesting about both of your conversations is that as we start to see code becoming a powerful force in society, we’re no longer just trying to change laws. Just as citizens try to encourage the government or congresspeople to change laws, we now find ourselves standing outside of companies saying well, there’s code that affects our lives, maybe the very systems we heard earlier that companies might want to keep secret for complicated reasons. How do we think about creating change when it comes to code that we don’t control?

Ethan Zuckerman: I published a very angry piece this morning in The Atlantic asking really the question of why Facebook doesn’t give us more control over the newsfeed algorithm. And my core argument was there’s this change going on right now: it’s going to reprioritize our friends and family, it’s gonna deprioritize content we’d paid attention to.

You could imagine giving users a checkbox, or a slider. You know, “Yes, I want this change. No, I don’t want this change. Here, let me try settings in between.” And we built a proof of concept, Gobo, which basically shows you that it’s possible to give people half a dozen sliders and get very different feeds out of it.
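In code terms, slider-driven reranking of the kind Gobo demonstrates can be sketched roughly as follows. The post features, slider names, and weights below are invented for illustration; this is not Gobo’s actual implementation.

```python
# Hypothetical sketch of slider-driven feed reranking in the spirit of
# Gobo. Feature names and weights are invented for illustration.

def rank_feed(posts, sliders):
    """Score each post as a weighted sum of its 0-1 features,
    with the weights taken from user-facing sliders."""
    def score(post):
        return sum(sliders.get(name, 0.0) * value
                   for name, value in post["features"].items())
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "viral_news",    "features": {"virality": 0.9, "friend_closeness": 0.1}},
    {"id": "family_update", "features": {"virality": 0.2, "friend_closeness": 0.95}},
]

# Slider positions for "prioritize friends and family."
friends_first = {"virality": 0.1, "friend_closeness": 1.0}
# Slider positions for "show me what's popular."
popular_first = {"virality": 1.0, "friend_closeness": 0.1}

print([p["id"] for p in rank_feed(posts, friends_first)])
print([p["id"] for p in rank_feed(posts, popular_first)])
```

Moving a slider just changes the weight vector, which is why the same half dozen controls can produce very different feeds from the same pool of posts.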

It’s interesting that the only path that I can think of to try to get Facebook to make this change is either buttonholing Facebook execs and saying, “I really think you should try this.” Or sort of naming and shaming, and sort of going out in the press and doing this.

It’s pretty clear that you’re not going to get there through norms, through laws. You’re not going to go to this Congress and sort of say let’s pass laws about transparency for social networks. It’s also unfortunately pretty clear that you’re probably not going to get there through code. I wish I saw a lot of alternative projects sort of coming up to challenge the dominance of these existing monopoly platforms. But these monopolies are very powerful once they get a certain user base.

So you do end up with this relationship like my dear friend Rebecca MacKinnon has written about, where it feels like you’re petitioning a monarch. You’re going to a sovereign and basically saying, “I would like you to be a benign despot rather than just a despot.” It’s really worrisome that we’re this far down this line.

Matias: And that’s MacKinnon’s book Consent of the Networked?

Zuckerman: That’s right. So one of the things that I wanted to ask Karrie in all of this is… So, it’s amazing that we have Julia Angwin, God bless Julia Angwin. It’s amazing that we have you and Christian, God bless you and Christian. Who should be auditing? Is this something that we want governments to do? Is this something that NGOs should do? And I don’t just mean online, I mean sort of in general. If we take everything from the housing discrimination cases that you were talking about, all the way through these algorithmic discrimination cases, how should we do this and what would we need to make this possible within the case of these sort of monopoly platforms that Nathan’s referring to?

Karrie Karahalios: So, I think that’s an excellent question and I think it depends on the specific thing that you’re auditing. So for example, there was a really nice web site a while back that looked at travel. And it wasn’t organized, but a lot of people would go on this travel site and they would talk to each other about when to find the best flights…in a sense they were auditing it together.

Zuckerman: Yeah.

Karahalios: And in this case I think that you know, every citizen has a right to be able to audit this and look at this data and put it together. Let’s say for example you want to do a collective audit for credit scoring. That is a really hard collective audit to do. That requires lots of different citizens to come together to pool their data, to give it to a trusted source, and have this trusted source actually aggregate it and find something with it. So in that case you need somebody who’s trusted, and going on your trust scale maybe it’s not the government here. You know, maybe it is a third party that is supported, and that the government will listen to, that will do something for you there. And so it really is very context-dependent.

But a lot of the things that we’re looking at, specifically in social media, we want every citizen to be able to do.

Zuckerman: Is that a reasonable task, right? So we’re united here with a group of really smart, really committed citizens who are doing the work of…in some cases becoming behavioral scientists sort of to be able to do the work. While I would love for everyone in the world to become as smart and engaged and thoughtful as the people in this room, my experience is that those people are actually fairly rare. Is there a way in which we need some sort of institutional layer to sort of ensure that we get there? I’m not sure that without a Julia we ever get an audit of these flight risk systems, for the simple reason that many of the people affected by these systems are not necessarily in a great position to sort of take the time and the incredible amount of work that it takes to audit them.

Karahalios: Yeah. And so what we found in our work for example is that in auditing an algorithm… Let’s say we audited an algorithm that had over a hundred features. It turns out that the people that actually cared about these hundred individual features were…you could count them on one hand. They didn’t really care so much.

It turned out most people wanted a high-level overview that everything was going okay. And that’s what every single person should have. And that’s why I love a lot of these plugins that ProPublica makes where you can actually interrogate the ads that you get on your pages, and you can look at lots of different things. And so the level that you get from the audit I think is very different from what people want. And so a computer scientist or developer might want to get thousands of features coming out of it to understand it. But in terms of the world of algorithm literacy, most people do not want that level of detail.

But the term that you said earlier, “control,” I think that’s critical. And could I go back for a second to address—

Zuckerman: Yeah.

Karahalios: —something you said earlier? So, I think control really is something we don’t think that much about. And I love the piece that you built with the sliders. I think it’d be wonderful to have the power to see something like that. And I also see that Facebook has controls in there. They have controls…sometimes they’re hard to find because they’re in like five, six different places. But they’re there if you want to use them. But this issue of control I worry sometimes can be a placebo effect.

And the reason I say that is because we recently conducted a study where we had controls similar to the ones you had, but not exactly the same controls. So for example, social media controls are different from controls in Photoshop. I can click a control that says “make this black and white” and I can see instantly that it happened. A social media control is subjective. A social media control might say that this is more popular. Or that this has more virality. Or that this person is closer to you in some way.

And what we did is we conducted the study, and we had people view a control panel that actually worked, to the best of our ability. And we tested this and people said it worked. And then we showed them the same feed, with the same control panel, but everything was random.

Zuckerman: Huh.

Karahalios: And it turned out that in assessing levels of satisfaction and preference, people preferred having a control panel to the case where they only had the feed with no control panel. But when we compared the random control panel and the fully-functional control panel, they were indistinguishable.
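The shape of that comparison can be sketched with toy numbers. The ratings below are invented, not data from the study; the point is only that both panel conditions can beat “no panel” while being essentially tied with each other.

```python
# Toy sketch of the three-way comparison described above: satisfaction
# ratings with no panel, a working panel, and a random (placebo) panel.
# All numbers are invented for illustration.
from statistics import mean

ratings = {
    "no_panel":      [3, 2, 3, 3, 2, 3],   # feed only
    "working_panel": [4, 4, 5, 4, 3, 4],   # controls actually wired up
    "random_panel":  [4, 5, 4, 4, 3, 4],   # controls change nothing real
}

means = {cond: mean(vals) for cond, vals in ratings.items()}
print(means)

# Having any panel lifts satisfaction over having none...
print(means["working_panel"] - means["no_panel"])
# ...while working and random panels are indistinguishable.
print(abs(means["working_panel"] - means["random_panel"]))
```

A real analysis would of course use a proper statistical test rather than eyeballing means, but the structure of the finding is the same.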

Zuckerman: Yeah, this is the button on the street lights, right. You have to have it; it just doesn’t have to actually be attached to anything.

Karahalios: Yeah. And so, I would love to keep talking to you more about this because I’m not really sure where to go from here!

Zuckerman: Yeah, no. We’re doing the same analysis. We didn’t think to be quite as evil as giving people completely disconnected sliders, but. And the one thing we did do is we added in a tab on each post that ends up in there that says “explain to me why this ended up in my feed,” or “explain to me why this is filtered out.” And we’re sort of hoping that that’s going to help with it.

You know, I have multiple questions in this. I question whether people want that much control. I think everyone thinks they want that much control. I think they will back away from using it over time.

Karahalios: Yeah. I agree.

Zuckerman: My guess is that Facebook’s settings are pretty good for a lot of people a lot of the time. I guess what I’m really interested in in all of this is…we are starting to find these different point cases where these algorithmic biases come into play. And we’re having fun sort of celebrating the worst of them, right? So the beauty contest is…amazing. Like you know, we do an automated beauty contest and we forget about people of color. Like, that’s a bit of an oversight. Joy Buolamwini, down in our lab, has been interrogating in general how badly facial recognition software does with people of color, and Joy is a very dark-skinned woman. Family’s from Ghana.

And what she started finding were these just completely quotidian examples. She would try to get Snapchat to put a filter on her face and it wouldn’t recognize her, and I’d show up in the frame and the filter would go on me, immediately.

We shouldn’t have to find the most absurd versions of these and then spend enormous amounts of time exposing them. First and foremost, companies should be doing this with their own technology, so that they are not getting humiliated when we pull it out, and because it’s the right thing to do in the long run.

But if we still feel the way that Nathan sort of posits that we feel, that we don’t have any ability except to go to the feet of the sovereign and say, “Oh dear please sovereign, please allow black people to be seen by your algorithm as well,” then for me this asks questions of do we need an NGO which is the algorithmic auditing force, or as Joy puts it the Algorithmic Justice League…do we need government auditors that sort of look at these—although you make the point that the government may be the least-trusted entity that we can deal with around this. How do we go after this on a systemic basis?

Karahalios: I think there’s two points here. One thing that we find in our studies, first and foremost, that we need to address and need more research on, is why people trust these algorithms. And so one of the things that we found is that— So I love that you put that tab on there, like “why do you think this is happening?” When we asked people why this was happening, they were like, “Oh…it’s the algorithm.” And they would almost always come up with an excuse—they would convince themselves that the algorithm was right and they would come up with a story to describe why the algorithm was right.

Zuckerman: Yeah.

Karahalios: And so I think that’s one issue we need to address, in the sense that first people need to… If we want people to stand up and do something, people have to start understanding that maybe it’s not an omniscient god-like algorithm. And so one step, which takes a really really really really long time, is some form of algorithm literacy. And that’s going to take…forever. I’m not gonna say that’s gonna happen overnight. But it doesn’t mean we should stop trying to make that happen.

Zuckerman: One of the great examples I find with this is… So we’ve seen this in the news field: people trust algorithmic news judgment, Google telling you this is the right news story for your query, almost two to one over human editors. But then when you show people targeted ads targeted to their demographic characteristics, most of us have had the experience of going, “Actually, I’m not a 60-year-old Ecuadorian llama farmer.” And when you can sort of remind people that it’s the same company that’s trying to sell you organic llama feed for your flock that’s also giving you the targeted news, they start eventually making the connection that Oh. Well actually, these things maybe aren’t all good, but that’s—

Karahalios: [crosstalk] Interesting—

Matias: What’s your second point, Karrie?

Karahalios: Oh, I’ll finish on that and then I’ll get to the second point.

So we just did a study on ads this summer. And it turns out that by having these interventions and providing these interventions for the public, we’re getting people to go from believing in algorithmic authority to algorithm disillusionment.

Zuckerman: Mmm.

Karahalios: And so for example, it turns out like, I thought that because I hated ads everyone hated ads. I really thought that. It turns out I was wrong. It turns out there’s a huge population of people that love ads. I just didn’t talk to them. Until this summer. And one of the things that people really like about ads is if it’s something that pertains to them, or if the explanation for the ad actually caters to something that they’re proud of themselves for. So for example one person said you know, “It’s as [if] I’m seeing this ad because I’m a hip millennial. I love that. I love this ad.” And so this explanation made them like the ad more.

And so one thing that I’ve had to do is change my model of…just because I don’t like them doesn’t mean other people don’t like them. And maybe think about how people might be able to have a discourse with some of these algorithmic systems around them, instead of just thinking of this algorithm as this black box that just gives you an answer and you deal with it. What if it was more of a conversation, for those who actually cared to participate?

And what was the second?

Matias: You said there were two points that you really wanted to get across.

Karahalios: Oh, yeah yeah yeah. And so—sorry. And so, like in the Fair Housing Act, where they actually stipulate that they support— Like, I would love for there to be some type of organization that also provided funding, because I can’t tell you how expensive these audits are, where you got this funding to conduct them. And maybe you have to prove yourself in some way. Like, I actually found the data scientists at Facebook open to discussion. Like, the Facebook 1.0 API that we used for a really long time. We’ve done audits with it. They were supportive of us doing it. They claim that they adapted some of our interface internally into Facebook. They tried to keep giving us access to it, but it was not possible.

And I don’t know the exact mechanism for this, but if there was some way that people could propose an audit, get funding for it, and do it… And I’m not saying— It’s actually unfair to say that it’s just academics. Because it feels like it’s a privileged position, and the audits that ProPublica has done have been incredible. And the investigative journalism that’s been done has been incredible. But I do think we need to start thinking about some channel that also provides funding to allow people to do these audits.

Zuckerman: One thing I really love about that idea (I swear it’ll be the last thing I say, Nathan, and then I’ll shut up) is that we’ve experienced the phenomenon for the last five or six years that people with very strong tech skills want to find a way not only to make the world better but to do so using that particular skill set. And so finding some way to sort of harness the energy not just of people in this room, but people around the world who are convinced that their technical skills, their analysis skills, could make the world better, this is a lovely direction to start pushing all the people who are starting to get very thoughtful about doing analysis of data in a direction that actually helps us deal with civil rights on a very broad level.

Matias: Excellent. So, I’ll take two questions one right after the other. Then we’ll try to batch them and answer them briefly so we can all have a break as well. So one here from the front. And then…someone on the side.

Audience 1: Hi, I’m Turrell. [sp?] I’m primarily on Reddit. But…and this doesn’t address your topic so much but addresses creating change. And it also addresses the idea of how we deal with these giant corporations’ sort of black box. The mantra we hear from so many moderators and platforms is “if you don’t like it, go make your own!” But we don’t really support that. And I think that’s something that we can really focus on: actually supporting—whether it’s a nonprofit or individuals or something—actually being able to start their own communities and giving them sort of startup information. The data, as well as the support systems? Just in general.

Matias: Thank you. And one question in the very back, on the aisle.

Audience 2: Hi, I’m Biman. [sp?] I’m curious how investigative journalism brings forward things that Facebook or Instagram might be doing which affect the mental model of the user. There was news about how they ran an experiment on a group of users where they might show a positive or negative newsfeed and that might affect their behavior. Or how Instagram might withhold likes from a user to make them keep checking the app. My question is that these things drive their profit model, and it’s not easy to make them stop doing that. So as a general public user, is raising awareness, so that users of these apps can take things with a grain of salt and not be affected psychologically by what they see, a good idea? Or how can a sense of control be applied to the mechanisms these companies are using, without thinking of the general mental effect it might be causing on the public?

Matias: Thank you. So, both questions touch on the business models, and the funding and sustainability, of the general benefits that we gain from the Internet, and the challenges created by this current business model ecosystem.

Zuckerman: Do you want to try first, or?

Karahalios: Sure. I guess I’ll start with a comment about your question first, in that I think one of the many ways to address a little bit of what you’re saying is through the design of the interface. And one approach that we’ve been looking at is this idea of seamful experiences, whereby you reveal something in the interface to actually help a user understand a little bit better. Now, whether or not Facebook would actually do this is debatable. But there are things in the seams that they are starting to do which look promising. And the idea of using the interface to help educate people is one that we need to keep looking at.

In terms of the psychology of it, I mean, that’s a long story that could go on for hours. But as a bridge from one question to the other, one of the things that I’m really excited about is that now more and more, like if I’m in a Lyft, or if I’m on the street, or if I’m talking to students, I think it’s incredible that the conversations we’re having are about algorithms in the sites that we use every day.

And something that would encourage more and more people to talk about this, and then how we could make sense of what they say, would be fascinating. Like, I would imagine that if people looked online at what people were saying and there were some aggregate visualization of people’s…like you showed in your talk, Ethan, that people might listen if they see more and more people discussing this certain topic.

I was going to say something else but I forgot so I’m gonna…

Zuckerman: So, answering the question from our Redditor friend about how we would encourage people to build their own platforms and move there. I had the privilege of spending about a year writing a study with Neha Narula and Chelsea Barabas, two really brilliant scholars here, on decentralized publishing systems. So we were really looking at this question of how do you build a decentralized Reddit? How do you build a decentralized Twitter?

And my initial instinct was, oh my god that’s really hard. How do you deal with all the database issues, namespace issues…? It turns out to be hard but not that hard. Like, technically there’s a bunch of good systems out there that get a long way toward solving them. So what that does is raise the question of why aren’t these platforms thriving?

And the big answer for why these platforms aren’t thriving is that people feel like they can’t exit the existing platforms. People have so much invested in these platforms, so many relationships, so many contacts, that sort of shutting it off and going cold turkey just isn’t the answer for most people.

And so the three of us ended up sort of recommending two things. And they’re pretty simple, but I actually think they’re pretty big. One is that you need the right to export from a platform. Not only all of your content but also your social graph. You need to be able to take the relationships you have and potentially move them elsewhere.

And the second one is sort of even more subtle. You need the right to aggregate. You need the right to be able to use a tool that would be able to follow you…not only on Twitter but on Mastodon; not only on Facebook but on Diaspora; not only on Reddit but on Steemit; and be able to sort of integrate those things so that you can still be involved with a community while you’re involved with a new one.
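In code terms, the “right to aggregate” amounts to something like the following sketch: each platform gets an adapter that returns posts in a common shape, and a client merges them into one timeline. The adapters and post fields here are invented for illustration; real platform APIs, and whether their terms allow this at all, are exactly the obstacle being discussed.

```python
# Hypothetical sketch of a cross-platform aggregator in the spirit of
# Gobo. The adapters and post shape are invented for illustration;
# a real tool would call each platform's actual API.
from dataclasses import dataclass

@dataclass
class Post:
    platform: str
    timestamp: int   # seconds since epoch
    text: str

def fetch_mastodon():
    # A real adapter would query the Mastodon API here.
    return [Post("mastodon", 1_700_000_200, "hello from the fediverse")]

def fetch_reddit():
    # A real adapter would query the Reddit API here.
    return [Post("reddit", 1_700_000_100, "a thread about decentralization")]

def aggregate(*fetchers):
    """Merge posts from every adapter into one reverse-chronological feed."""
    posts = [post for fetch in fetchers for post in fetch()]
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

feed = aggregate(fetch_mastodon, fetch_reddit)
print([p.platform for p in feed])
```

The merge itself is the easy part; the policy change Zuckerman goes on to describe is what would guarantee such adapters are allowed to exist.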

This probably requires a policy change. I don’t see Facebook adding the Export From Facebook button. And I’ll tell you Facebook has made it impossible for us to use Gobo, which is a prototype aggregator, to fully engage with Facebook. But if I were going to push for legislation the way that you and Christian and others are so wonderfully pushing for judicial judgment, it would be for those two things. Those are the factors that I think would let us have an environment that makes it much more likely that we could have competitive platforms.

Karahalios: And the one thing I would add to that—I completely agree—is this…you know, in the early days of the Internet, I remember in 1993 my adviser told me to make an HTML page. I had no clue what he was talking about. So I went to the NCSA web site and figured out what it was. We built a camera in three days to get a picture of us, because we didn’t have a digital camera. We put it up there.

Today, everyone can have presence online using Facebook. So you know, myself and my colleagues are trying to build tools to make it easier for people to put up some of these networking sites without having to be an expert programmer. Like, if you could use a basic toolkit to start your own little Facebook group. But again, what Ethan says is critical. Like, if you’re there alone it’s kinda useless. You need to bring your friends with you, and that’s hard. That’s hard.

Matias: Well, thank you so much for sharing with us. I think some things that I’m drawing from this… First, many of the people in this room are in a situation where we are both governed by algorithms and putting algorithms in place. Today and over this weekend, people will be discussing how communities can deploy their own AI systems to do moderation. And there might be lessons about transparency, or audits, or how people can question those systems, that communities can learn.

Another thing is this reminder of the value of building public interest nonprofit organizations, which we are partly here to think about, that serve these goals.

And finally I’m just going to point out that I’m on stage with someone who just nonchalantly said, “We didn’t have a digital camera so we built one in three days so that we could participate on the Web.” So even as we look at insurmountable challenges, or at least challenges that seem insurmountable, we need to remember that today many of us have a digital camera in our pocket, and many things are possible that we can’t even yet imagine. So let’s thank these two wonderful speakers for their participation.

