Karrie Karahalios: So in 1963, demonstrators, black and white, marched on Washington. I was trying to explain Martin Luther King Day to my seven-year-old the other day, and it was really, really hard. And I was secretly— I was publicly glad he could not imagine a space where different people had different water fountains, or had to go to different schools because of the color of their skin.

In reading about this historical event, it was also interesting to learn from historians that JFK didn't even want to do this this early. He was hoping this would be a second presidential term type of thing, not a first-term thing. But watching this very public violence unfold in the newspapers and on television made him want to end this unrest much, much sooner. And so he actually went along and supported this protest, to minimize any possible violence that might occur if he did not support it.

By the way, people were actually marching for many, many things. Over 200,000 people showed up. They were marching to stop segregation, for equality in housing, and for equality in society—jobs in general, as you can see from some of these photos.

Less than a year later, Martin Luther King was dead, JFK was dead, and Lyndon Johnson, who was very much behind this, signed it into law. This was the Civil Rights Act of 1964, and there were many other Civil Rights Acts before and after. It prohibited discrimination based on race, color, religion, sex, and national origin by federal and state governments as well as in some public spaces. Also, what was interesting is that this bill actually went beyond what was proposed in the House and the Senate: it provided for the cutoff of funds for those who violated some of these laws.

And as I said, after the Civil Rights Act of 1964 came the Fair Housing Act of 1968, which originally prohibited discrimination based on race, color, religion, and national origin. In 1974 they added sex. In 1988 they added disability and familial status.

And one thing I just want to state is that it's been harder historically for black women in the world of housing. It turns out that more single black women owned houses than single black men, and yet credit has been harder for them to obtain. The Equal Credit Opportunity Act of 1974 made it illegal to discriminate in the credit process based on race, color, religion, national origin, sex, marital status, or age.

And then in 1987 we had the Housing and Community Development Act. And that brought on the Department of Housing and Urban Development to enforce the FHA. It further supports (and this is critical to our work) special projects, including the development of prototypes to respond to new or sophisticated forms of discrimination against protected persons.

So what did they do to detect these new forms of discrimination? They started with a type of audit. What they did is they created pairs of entities looking for housing, matched them based on economic status, familial status, and so on, and had them visit a realtor successively. And what they found was when one person was discriminated against versus when they were not. The first paired study was done in 1977. The most recent one was done in 2012.

And in the most recent one, following the same traditional audit approach, what they found was that blacks were shown 17.7% fewer homes than whites, and told about 17% fewer homes than whites. Asians were shown 18.8% fewer homes than whites, and were told about 15.5% fewer homes than whites. This isn't the online world; this is people physically visiting homes. But imagine going online to a housing site. What could be happening under the hood? I'm going to get back to that in a second.
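As a rough illustration of how the outcomes of such a matched-pair audit might be tallied (the counts and field names below are invented, not HUD's actual data or methodology), here is a minimal sketch:

```python
# Minimal sketch of tallying a matched-pair housing audit (invented data).
# Each record pairs a white tester and a black tester matched on income,
# family status, and so on; we count how often each was shown fewer homes.
pairs = [
    {"homes_shown_white": 6, "homes_shown_black": 4},
    {"homes_shown_white": 5, "homes_shown_black": 5},
    {"homes_shown_white": 7, "homes_shown_black": 5},
]

black_shown_fewer = sum(p["homes_shown_black"] < p["homes_shown_white"] for p in pairs)

total_white = sum(p["homes_shown_white"] for p in pairs)
total_black = sum(p["homes_shown_black"] for p in pairs)

# Relative difference in homes shown, analogous to the percentages reported above.
gap = (total_white - total_black) / total_white
print(f"Black testers shown fewer homes in {black_shown_fewer}/{len(pairs)} pairs")
print(f"Overall, black testers were shown {gap:.1%} fewer homes")
```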

The Fair Housing Act addresses what we see, but it also goes a step further, to address how we advertise to people. So for example you could not have an ad that "indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, national origin, or intention to make any such preference, limitation, or discrimination." So for example, I cannot put out an ad that says "this is ideal for a white tenant," or "this is ideal for a white female tenant."

Interestingly enough, we saw that March on Washington, and over fifty years later we're still struggling. In fact there was a case in 1991 against The New York Times, where it turns out that they had thousands and thousands of housing ads in their Sunday Times featuring white models. Many of these white models depicted the representative or potential homeowner. Black models typically represented the janitor, the groundskeeper, the doorman, or an entertainer. And it turns out that in all of these, it was traditionally white families. And so the case was called, I believe, Ragin versus… Let me find out the exact name of the case before I get it wrong. Ragin v. New York Times, 1991. They argued that the repeated and continued depiction of white human models and the virtual absence of any black human models indicated a preference on the basis of race.

The New York Times fought this for four years before they settled. The Washington Post, incidentally, changed their policies right away. After that, they basically changed how models were represented in their newspapers.

So I was really thrilled when Rachel Goodman wrote for the ACLU in 2018 that:

Thankfully, our exist­ing civ­il rights laws apply no less force­ful­ly when soft­ware, rather than a human deci­sion mak­er, engages in dis­crim­i­na­tion. In fact, equipped as they are with the dis­parate impact frame­work, the Fair Housing Act (FHA), Title VII of the Civil Rights Act of 1964 (Title VII), and the later-enacted Equal Credit Opportunity Act (ECOA), these laws can make clear that twenty-first cen­tu­ry dis­crim­i­na­tion is no less ille­gal than its ana­logue predecessors.
Rachel Goodman, Winter 2018

Now, I was so excited to read this because Rachel is an incredible lawyer. But also because we've been trying really hard to fight some of this discrimination online and running into barriers.

So in 2014, with my wonderful colleagues that I feel privileged to work with, Christian Sandvig, Kevin Hamilton, and Cedric Langbort, we wrote a paper discussing the importance of auditing algorithms, defining the term, and laying out methodologies for doing this and why it matters. One example might be fair housing. Another example might be economic opportunities; can an algorithm actually restrict an opportunity to somebody? Might a type of algorithm that makes more money in one way actually hurt another population? Do some people actually have advantages because of where they live, because of certain algorithmic interfaces?

And we were working on this because more and more of these transactions, whether it be housing, credit, or employment, for core social good, are happening online and are happening at scale. And many of these imbalances are not readily visible to people. We don't see the separate water fountains. And it's very, very hard to see what's happening. The creators of these algorithms might not even know what's happening. And they might not even know that injustice has occurred. You cannot easily compare them.

And so in this paper we describe many different approaches to doing these audits. But two of the big tools were scraping and the sockpuppet. Many startups today use scraping to get their companies going. A sockpuppet is an identity you create online. Let's say I make 200 identities; I make half of them black, half of them white; vary the age ranges; set them out there and see what happens. And Christo Wilson, who's in the audience today, has done some amazing work using sockpuppets to study price discrimination and Uber surge pricing. There are too many audits to explain in this short talk.
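As a very rough sketch of what setting up such paired sockpuppet identities might look like (the field names and values here are illustrative, not drawn from any actual study, and actually registering such accounts can violate a site's terms of service, as discussed later):

```python
import itertools
import random

# Rough sketch: generate matched sockpuppet profiles for an audit.
# The attributes and counts are illustrative only.
races = ["black", "white"]
ages = [25, 35, 45, 55]
sexes = ["female", "male"]

target_count = 200
per_cell = target_count // (len(races) * len(ages) * len(sexes))  # identities per combination

profiles = []
for race, age, sex in itertools.product(races, ages, sexes):
    for i in range(per_cell):
        profiles.append({"id": f"{race}-{sex}-{age}-{i}", "race": race, "age": age, "sex": sex})

random.shuffle(profiles)  # randomize the order in which identities are deployed
print(f"Generated {len(profiles)} profiles")
```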

But what's interesting about these studies is that they mimic the traditional audit studies. And they're really hard to do. Facebook is not okay with this, with you having multiple identities. Google is more okay with it. But you'll see later that some terms of service make it challenging.

[Screenshot of a collection of Airbnb listings]

And so by using tools like sockpuppets, scraping, bots, and APIs, we can look at a site like this for housing and maybe try to figure out if some discrimination is happening. Are these homes prioritized differently for different people based on their age, their sex, and so forth? And it'll help us actually understand why some of this might be happening.
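A minimal sketch of one way such a comparison might be done, assuming we have already recorded the ordered list of listings served to each sockpuppet (the profile identifiers and listing results here are made up):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical data: the ordered listings each sockpuppet profile saw.
results = {
    "white-female-35-0": ["A", "B", "C", "D"],
    "black-female-35-0": ["B", "D", "A", "C"],
}
profile_group = {"white-female-35-0": "white", "black-female-35-0": "black"}

# For each listing, collect its rank position per demographic group.
ranks = defaultdict(lambda: defaultdict(list))
for profile_id, listings in results.items():
    group = profile_group[profile_id]
    for position, listing in enumerate(listings, start=1):
        ranks[listing][group].append(position)

# Compare average rank by group; large gaps suggest different prioritization.
for listing, by_group in sorted(ranks.items()):
    print(listing, {group: mean(positions) for group, positions in by_group.items()})
```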

For example, discrimination can occur for many reasons. The system, for example, can capture the biases of the humans that use or built it. Take this study by Ben Edelman— He's done some amazing studies, by the way. (Sorry to use the term "amazing" so much, like Trump.) He studied racial discrimination on Airbnb, and he found that if you had a distinctively African-American name, you were 16% less likely to be accepted for a home relative to identical guests with distinctively white names.

He also did an interesting study showing that if you're a black landlord you make less money than if you're a white landlord, controlling for other factors about the home. And discrimination occurred across all landlords, both ones sharing their properties and larger landlords with more properties. And this suggested that Airbnb's current design choices—and I'm a computer scientist and a designer in human-computer interaction—can actually facilitate discrimination and raise the possibility of erasing some of these civil rights gains that people have fought so hard for.

So for example, some of the things they proposed were not having people's pictures up front, having statistics up front and then getting to the pictures later, or suggestions to remove the pictures altogether. But it turns out users hated that, so they put pictures back in to satisfy the users. And so here, they were also working on interfaces to change some of these sites. And again, while Edelman's studies have not looked at the algorithm behind it, they looked at a lot of the inputs.

And there are other places to look at inputs as well. For example here. ProPublica found in October of 2016 that Facebook let advertisers exclude users by race. Now, anyone who placed an ad on Facebook or many similar sites like Google could have known this. But it turns out that people didn't discuss this idea of what it means to use ethnic affinity as a targeting variable. And while it's perfectly legal to target men's clothing to men, it's illegal to advertise high-paying jobs exclusively to that same group.

And this is not new in advertising. It turns out that traditionally, because of these laws, advertisers have tried to use ZIP code as a proxy, or area code as a proxy, to target specific groups. And you don't know what might be happening there, especially when you start getting into machine learning algorithms—specifically deep learning algorithms that use neural nets—where you might have a variable that forms along the way that can actually capture some of these characterizations based on other data that you have there. In fact, many of the designers don't even know what's going on. Some of these networks are now thousands and thousands of layers deep.
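As a toy illustration of the proxy problem (all of the numbers here are made up): a targeting rule that only ever looks at ZIP code can still reproduce a racial disparity when neighborhoods are segregated, even though race is never an input.

```python
# Toy data (invented): each record is (zip_code, race, shown_ad).
# The targeting rule only uses ZIP code, yet the outcome splits along race
# because the ZIP codes themselves are segregated.
population = [
    ("60601", "white", True), ("60601", "white", True), ("60601", "black", True),
    ("60621", "black", False), ("60621", "black", False), ("60621", "white", False),
]

def share_shown(race):
    shown = [was_shown for _, r, was_shown in population if r == race]
    return sum(shown) / len(shown)

print("Ad shown to white users:", share_shown("white"))  # 2/3
print("Ad shown to black users:", share_shown("black"))  # 1/3
```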

And in talking to many developers and engineers of some of these ad targeting systems, I mentioned to them that maybe this shouldn't be happening, and they looked at me, appalled, like, "Karrie, we need to use all of the data to get the best possible match." Many of the developers are not familiar with the law. Many of the developers went on to further counter that perhaps people who are black want to have this type of house, and knowing that information could make it better. So there are so many levels here where we have to build algorithm literacy, and I'm going to get to that in a bit.

And so because of this, Facebook put out a public release saying that they were appalled. They did not want to help further historical oppression. And they wanted to do something about this, particularly in the areas of housing, employment, and credit, where certain groups have historically faced discrimination. And they claimed they went back and modified their system, so this should not be happening on Facebook anymore. This was February 2017.

In November of 2017, ProPublica went back and found that Facebook was still letting housing advertisers exclude users by race. This is calling for more regulation. At some level this is an interface issue. It's also becoming an algorithm issue, because one of the defenses is that it's hard to know what is housing and what is not. Would an algorithm be used to determine what is a housing ad versus what is an employment ad, and so forth? And it's just going to keep getting much, much more complex when these characteristics are inferred from other data, as I mentioned with some of these neural nets and other deep learning systems.

Moving more towards algorithms, Latanya Sweeney's seminal study looking at algorithmic discrimination based on race clearly found that by putting in a white name you got different ads than if you put in a characteristically black name. And this was for many reasons. There's a bias that could be introduced by what people are clicking on on this platform. And that's something to keep thinking about. A system should actually be able to understand that users' biases might be captured here. And there are no easy solutions to this. Biases are captured in so many different ways. But we need to be aware of them, and these audits are so very important.

Another example that we touched on earlier, and it finally came to pass: it turns out women were being shown ads for lower-paying jobs than men were being shown on Facebook. But again, a lot of this could be based on what people click. A lot of this could be based on what people are targeted towards.

But there's also another element of bias that comes in that people didn't think about as recently as 2016. How many of you have seen the Beauty.AI project, out of curiosity? A few of you. This was supposed to be the first-ever unbiased, objective beauty contest, one that would just look at features of your face. It employed deep learning. It employed neural nets with many, many different layers. It was sponsored by many respectable communities and got lots of money, from all over the place. Over 6,000 people submitted photos to this.

Of the people that submitted photos from all over the world, out of the forty-four winners nearly all were white. There were four Asians, I believe. Only one had dark skin. And these photos, like I said, were sent from all over the world.

There were a number of reasons why the algorithm favored white people. The main problem, admitted by the developers, was that the data the project used to establish standards of attractiveness, their training set, did not have black faces. So for example, how can you have an objective algorithm if your training set is biased? While the algorithm had no rule in it to actually treat a different skin color differently, the training set it used steered it and taught it that white faces were more attractive, despite using many of these other parameters as well. The developers of this tool stated this and were like, "We kinda forgot that part in our algorithm." It's not just the algorithm; the training set matters as well.
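A toy sketch of how that happens (the counts are invented): a model trained on a skewed set of examples ends up favoring the over-represented group even though no rule ever mentions skin tone.

```python
from collections import Counter

# Invented training data: almost all "winner" examples come from one group.
training_set = (
    [("light", "winner")] * 40 +
    [("light", "not_winner")] * 10 +
    [("dark", "not_winner")] * 3      # the other group is barely represented at all
)
counts = Counter(training_set)

def p_winner(skin_tone):
    wins = counts[(skin_tone, "winner")]
    total = wins + counts[(skin_tone, "not_winner")]
    return wins / total if total else 0.0

print("P(winner | light):", p_winner("light"))  # high
print("P(winner | dark):", p_winner("dark"))    # zero, purely from the training data
```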

And this is an example I know I don't need to explain to most people in the room. We can't capture all forms of bias. But a first important step is to audit these systems. And ProPublica has been a public leader in this, and I'm so excited and I can't wait to see more of what they do. They published this article in May of 2016 looking at recidivism rates. They did not have— We do not have access to a lot of these algorithms. They're black boxes to us. We cannot see inside of them. And that's why it's so important to study them from the outside.

And what they found was that this recidivism score, the likelihood that people would commit a crime again, falsely flagged black defendants more so than white defendants. In fact, white defendants were mislabeled as low risk more than black defendants. Such systems, for those of you who don't know, are in use in many states today. They dictate who can stay home before a trial, and in some cases how long your sentence should be. And even though, after interviewing the designers of this system, they claimed they did not design it to be used in this way, systems are being used in this way.
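A minimal sketch of the kind of comparison behind that finding, with invented records (this is not ProPublica's data or code): compute, separately for each group, how often people who did not reoffend were flagged high risk, and how often people who did reoffend were labeled low risk.

```python
# Invented records: "flagged" = predicted high risk, "reoffended" = actual outcome.
records = [
    {"group": "black", "flagged": True,  "reoffended": False},
    {"group": "black", "flagged": True,  "reoffended": True},
    {"group": "black", "flagged": False, "reoffended": False},
    {"group": "white", "flagged": False, "reoffended": True},
    {"group": "white", "flagged": False, "reoffended": False},
    {"group": "white", "flagged": True,  "reoffended": True},
]

def error_rates(group):
    rows = [r for r in records if r["group"] == group]
    non_reoffenders = [r for r in rows if not r["reoffended"]]
    reoffenders = [r for r in rows if r["reoffended"]]
    falsely_flagged = sum(r["flagged"] for r in non_reoffenders) / len(non_reoffenders)
    mislabeled_low = sum(not r["flagged"] for r in reoffenders) / len(reoffenders)
    return falsely_flagged, mislabeled_low

for group in ("black", "white"):
    flagged, missed = error_rates(group)
    print(f"{group}: falsely flagged high risk {flagged:.0%}, mislabeled low risk {missed:.0%}")
```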

And so, after doing years and years of these audits, we got really excited in May of 2016, when the White House released this press release on big data. And they named five national priorities that are essential for the development of big data technologies. And one was algorithm auditing, and they cited our paper, and we were doing somersaults. Among the other things they looked at were an ethics council, an appeals process, algorithm literacy, and so forth. And in their document they wanted to "promote academic research and industry development of algorithmic auditing and external testing of big data systems to ensure that people are being treated fairly."

Again, we were so excited by this, and especially to keep moving with some of the algorithms that we had in place. But then we got stuck. Because for a lot of what we're doing, such as web scraping and the creation of sockpuppets, it turns out that many courts and federal prosecutors have interpreted the law to make it a crime to visit a web site in a manner that violates the terms of service or terms of use established by that web site.

We did not want to be criminals. I do not want to ask my students to do something that might be a federal crime. And yet we still think that it's so important to get some of this work out. This causes a lot of concern, causes a lot of fear, and in many cases it actually closes paths of research and closes avenues available to us to do this work and to do it much more cheaply. This despite the fact that we had no intent to cause material harm or to target web sites' operations, and have no intent to commit fraud or to access any data or information that is not public.

Just to give you a hint, for those of you not too familiar with some terms of service rules, one thing that I want to state is that scraping and some of these violations of terms of service, since Ethan mentioned norms, are a norm in computer science. Nonetheless, our research group received legal advice from top lawyers recommending that such scraping was almost certainly illegal under the CFAA. And we were told that if we ever received a cease and desist letter, we should stop. If we ever had our IP addresses blocked—and they're blocked almost all the time—we should stop. And they also conscientiously asked us to do back-of-the-envelope calculations on what load we were putting on the web sites. And if the load was too high, then just to stop. In another project a while back, a lawyer so much as told me not to read any terms of service so that I could have plausible deniability in case I were ever contacted.
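That back-of-the-envelope load calculation is simple to do; here is a small sketch, with assumed numbers (both the scraper's volume and the site's traffic are placeholders you would have to estimate yourself):

```python
# Back-of-the-envelope load estimate with assumed numbers: how much of a
# site's traffic would our scraper represent?
requests_per_day = 10_000            # pages our audit fetches per day (assumption)
site_requests_per_day = 50_000_000   # rough estimate of the site's daily traffic (assumption)

share = requests_per_day / site_requests_per_day
rate_per_second = requests_per_day / 86_400

print(f"Scraper traffic share: {share:.5%}")
print(f"Average request rate: {rate_per_second:.2f} req/s")
```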

But just some examples of terms of service. Automated collection is forbidden. So this is Facebook on top: I cannot use "automated means (such as harvesting bots, robots, spiders, or scrapers) without prior permission." Furthermore, I will not "facilitate or encourage any violations of this statement or the policies."

Pokémon GO: you cannot "extract, scrape, index, copy, or mirror the services or content." This gets at our scraping tool.

In terms of identifiers or sockpuppets, it turns out I cannot "manipulate identifiers in order to disguise the origin of any content transmitted through the Twitch service," or "impersonate or misrepresent my affiliation with another person or entity."

Furthermore, in some cases they go so far as to say that I cannot use "a scraper, automated means, methodology, algorithm or device or any manual process for any purpose." So does that mean I cannot write something on a piece of paper? And so Ethan, while I'm a big fan of the law, I don't think it's very clean. And I would love to discuss that further.

In this case I cannot reverse engineer, something that many computer scientists do all the time with black box systems. In fact, it's encouraged in the security community. You cannot do security research unless you do some of this.

Also, terms of service change. This app reserves the right to update and modify its terms of service at any time without notice. I was once working with a company for six months, and when I started the project they were so excited. They were like, "This is great work. We support you. This is amazing." A year later the same people told us, "Um, you're now violating terms of service." The excitement had turned to less excitement, and to expressions of caution.

Furthermore, in some cases you actually can waive a right to a trial. Here you agree that disputes "between you and Niantic will be resolved by binding, individual arbitration, and you are waiving your right to a trial by jury or to participate as a plaintiff or class member" in any later proceeding.

In this case, one lawyer actually told me that I should not publish a paper I was working on, because not only can I not copy or scrape, but I cannot "publish or promote," and "publish" might mean publishing a research paper.

And so you know, there are other interesting obstacles that keep coming up here. The terms of service are often couched under the Computer Fraud and Abuse Act, which criminalized certain computer acts. And certain uses of the CFAA, or the Computer Fraud and Abuse Act, have been denounced by researchers because of our norms. Because we could all go to jail for security research or discrimination research at any moment, and a jury could happily convict us.

So what is happening in this law? Basically, it prohibits unauthorized access to protected computers under certain circumstances. But the key that's interesting here for us is "whoever intentionally accesses a computer without authorization or exceeds authorized access, and thereby obtains something…" It's this "exceeds authorized access" phrase which is confusing. And it's been repeatedly interpreted by courts and the federal government to prohibit accessing a publicly available web site in a manner that violates the web site's terms of service. If you do, the first violation carries a one-year maximum prison sentence and a fine. A second or subsequent violation carries a prison sentence of up to ten years and a fine. And intent to cause harm is not considered.

And so, with some colleagues, Christian Sandvig, Alan Mislove, and Christo Wilson, who's in this room, we filed a lawsuit with the amazing lawyers of the ACLU, basically challenging the broad and ambiguous nature of this law, which prohibits and chills a range of speech and expressive activity that is protected. It prevents individuals from conducting robust research on issues of public concern when websites choose to forbid such activity. It could also violate the First Amendment and the due process clause of the Fifth Amendment to the US Constitution.

We filed this lawsuit on June 29 of 2016. On September 9 of 2016, the government filed a motion to dismiss. One month after that, they released a previously non-public document that they had made in 2014 suggesting their policy for when they target CFAA crimes. And an example of something they discussed in this document is that one would not get prosecuted for lying about their weight on a dating site, even though that violates terms of service.

However, in the link for this they also say that "the principles set forth here, and internal office procedures adopted pursuant to this memorandum," are solely guidelines. They do not protect anyone, specifically researchers. And that was our goal here. I'm not a lawyer. After this process I really wish I were a lawyer. If you want to know the details of the case, please look at the web site. The lawyers have put up all the public documentation for it.

And in closing, I want to say that this has not stopped us from building these tools, and I think we should all keep building these tools. Similar to the images in the beginning, where a bunch of people came together collectively, one of our big focuses at the moment is building collective audits. This is the idea where people come together and actually reappropriate a site—for example, we explored a hotel booking site and found that it biased the ratings of low-rating to middle-rating hotels upwards by 25%.

And what we also found, by looking at the conversations people were having, was that we weren't the first ones to find this. You, the people, had found this bias before us. And by looking at the conversations on there, you helped us find it and make it public. And not only that, but people were reappropriating the site and saying, "This is what the site says my rating is, but this is what it should be." And they were actually making it their own platform, as opposed to what it was before.
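As a small sketch of the kind of comparison such a collective audit enables (the hotel names, displayed scores, and review values below are all invented): compare the rating the site displays against a rating recomputed from the raw reviews people contributed.

```python
# Sketch of a collective-audit comparison (invented numbers): the rating a
# booking site displays for each hotel versus a rating recomputed from the
# raw user reviews people shared with us.
hotels = {
    "Hotel A": {"displayed": 3.8, "user_reviews": [3, 3, 2, 3, 4]},
    "Hotel B": {"displayed": 4.5, "user_reviews": [5, 4, 5, 4]},
}

for name, data in hotels.items():
    recomputed = sum(data["user_reviews"]) / len(data["user_reviews"])
    inflation = (data["displayed"] - recomputed) / recomputed
    print(f"{name}: displayed {data['displayed']}, recomputed {recomputed:.2f}, "
          f"inflated by {inflation:.0%}")
```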

So we also need geographical tools to address these issues, so that what happened with Amazon Prime's site does not happen today. We need tools that actually let people interrogate their own data, which they cannot do today. And we want people to be able to collect data societally, to help us make these collective audits. And so I'm gonna end there and say that we want people to come together not just to protest but to share and to have a voice. And thank you. And I also want to thank the ACLU for helping us with all of this. And all the students involved. All the faculty involved that I've worked with. And the community at large, like the MIT Media Lab, like the Berkman Center, that have actually been very, very supportive of this work.