Christo Wilson: Hi! Good evening. So I’m Christo Wilson. I’m a computer science professor over at Northeastern. And like Karrie I consider myself to be an algorithm auditor. So what does that mean? Well, I’m inherently a suspicious person. When I start interacting with a new service, or a new app, and it appears to be doing something dynamic, I immediately begin to question what is going on inside the black box, right? What is powering these dynamics? And ultimately what is the impact of this? Is it changing people’s perceptions? Is information being hidden? What is going on here?

So to give you a little flavor for how this work plays out, I want to talk about one of my favorite audits that we conducted, where we looked at surge pricing on Uber. Now of course, Uber is a good, upstanding corporate citizen, right. There’s no reason to suspect that they might be tinkering with the prices just to charge you more money. But nonetheless, we were curious how these prices got calculated. So we signed up for dozens of Uber accounts. We spoofed their GPS coordinates to place them in a grid throughout a metropolitan area. And that enabled us to see all the cars that were driving around that were available. When those cars disappeared, that implied that they got booked, and we could see all the prevailing prices.
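To make that setup concrete, here is a minimal sketch of the measurement loop in Python. The grid origin, the spacing, and the nearby_cars function are all hypothetical stand-ins; the actual study observed available cars through the Uber app via GPS-spoofed accounts.

```python
import itertools
import random
import time

# Hypothetical grid over a metro area: the origin and spacing here are
# assumptions for illustration, not the study's actual parameters.
LAT0, LON0 = 40.75, -74.00   # rough center of Manhattan
STEP = 0.01                  # grid spacing in degrees
GRID = [(LAT0 + i * STEP, LON0 + j * STEP)
        for i, j in itertools.product(range(5), range(5))]

def nearby_cars(lat, lon):
    """Stand-in for what a GPS-spoofed client at (lat, lon) observes.
    The real study read available cars out of the Uber app; here we just
    simulate a handful of car IDs so the sketch runs."""
    return {f"car-{random.randint(0, 20)}" for _ in range(random.randint(3, 8))}

def poll(grid, interval=5.0, rounds=10):
    """Poll every grid cell repeatedly; a car that disappears between
    consecutive polls is treated as a likely booking."""
    previous = {cell: set() for cell in grid}
    for _ in range(rounds):
        for cell in grid:
            current = nearby_cars(*cell)
            booked = previous[cell] - current  # cars no longer visible
            if booked:
                print(f"{cell}: {len(booked)} cars disappeared (likely booked)")
            previous[cell] = current
        time.sleep(interval)

poll(GRID, interval=1.0, rounds=3)
```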

So, the good thing that we found from this is that indeed, the surge prices strongly correlated with supply and demand for vehicles. And that matches our expectations for how we think these numbers should be calculated. So that’s good.
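For a sense of what that check looks like operationally, here is a toy version: correlate the observed surge multipliers against a simple demand-to-supply ratio. The numbers are made up for illustration, and the ratio is our simplification, not Uber’s actual formula.

```python
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external dependencies."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Made-up observations purely for illustration:
# (estimated bookings, available cars, observed surge multiplier)
observations = [
    (12, 30, 1.0),
    (25, 18, 1.4),
    (40, 10, 2.1),
    ( 8, 35, 1.0),
    (33, 12, 1.8),
]

demand_over_supply = [booked / available for booked, available, _ in observations]
surge = [s for _, _, s in observations]

print(f"corr(demand/supply, surge) = {pearson(demand_over_supply, surge):.2f}")
```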

The bad thing was that we also found that Uber was randomly giving users incorrect prices for about seven months. And they didn’t realize that until we got in touch with them and talked to their engineers.

Screenshot of Uber's transparency blog post, "Peeking Under the Hood at Uber," which is nearly identical to the title of Wilson's research paper

Another positive outcome of our audit was that Uber opened a transparency blog the day our study got published. So this is not, you know, full transparency, right. It’s just cracking the window a little bit. But something is better than nothing. I just wish that they had chosen a different title for their blog. You know, emulation is the sincerest form of flattery, but…you know, whatever, it’s fine, right.

So actually what I would like to talk about in a little bit more detail is some of our ongoing work looking at Google search. It feels like social media has sort of gotten most of the blowback from the fake news and misinformation debacle. But really, Google search remains the primary lens through which people locate and consume content on the Web, much more so even than Facebook to this day. So understanding how Google search works, how it presents content to users, is incredibly important.

One of my PhD students has been running in-lab experiments where we bring people in and we show them search results that are heavily biased, right. And then we survey them before and after to see if their political beliefs or their perceptions of the candidates have changed.

And the shocking thing is that when you show people heavily biased search results, it can have a very large impact on their perceptions of the candidates. Large enough to even swing people’s voting positions if they were sort of on the fence to begin with.
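As a sketch of the kind of pre/post analysis such an experiment involves, consider paired before-and-after ratings. The ratings below are fabricated for illustration only; they are not the study’s data.

```python
from statistics import mean, stdev

# Illustration-only (pre, post) candidate favorability ratings on a 1-7
# scale, one pair per participant; NOT data from the study.
ratings = [(4, 6), (3, 5), (5, 5), (4, 7), (2, 4), (4, 6)]

shifts = [post - pre for pre, post in ratings]
effect_size = mean(shifts) / stdev(shifts)  # paired Cohen's d

print(f"mean shift: {mean(shifts):+.2f} points, paired d = {effect_size:.2f}")
```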

Now, this is an in-lab experiment. And I’m not implying that Google is doing anything like this. The problem is that we just don’t know, right. We need to go out and measure Google to see what information they’re actually presenting to people. And that itself, though, is a challenge. I can go write a scraper that runs automated Google searches and collects data, but that’s a fundamentally incomplete picture of what’s coming out of the system. We all know that Google search is heavily personalized.
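For reference, that logged-out scraper baseline can be as simple as the sketch below. The User-Agent header and the h3 heuristic for result titles are assumptions that break whenever Google changes its markup (and it requires the third-party requests package); the key limitation is that this view carries none of a real user’s personalization.

```python
import requests
from html.parser import HTMLParser

class TitleGrabber(HTMLParser):
    """Collects text inside <h3> tags, which Google's result pages have
    used for result titles; this heuristic is fragile by nature."""
    def __init__(self):
        super().__init__()
        self.in_h3 = False
        self.titles = []
    def handle_starttag(self, tag, attrs):
        if tag == "h3":
            self.in_h3 = True
            self.titles.append("")
    def handle_endtag(self, tag):
        if tag == "h3":
            self.in_h3 = False
    def handle_data(self, data):
        if self.in_h3:
            self.titles[-1] += data

resp = requests.get(
    "https://www.google.com/search",
    params={"q": "candidate name"},         # a political query of interest
    headers={"User-Agent": "Mozilla/5.0"},  # bare clients are often blocked
    timeout=10,
)
parser = TitleGrabber()
parser.feed(resp.text)
print(parser.titles)
```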

In this case, these results are different because of location. But there are other factors, like what’s in your Gmail inbox, what you’ve searched for and clicked on lately, whether you’ve interacted with advertisements, that could all potentially impact the output of the search engine. So to really get an understanding of how this system works, and the kind of information, especially political information, that it’s displaying to users, we need to enlist your help.

So, in the next couple of months we’re going to be rolling out what we call a collaborative audit. This is a browser extension that we’re going to try to get people to install, which allows us to essentially borrow your cookies. You install it, and that gives us the ability to run searches in your browser.

Now, we’re not going to be collecting your searches or your search history, right. That’s a privacy violation, and super creepy. I just want to borrow your browser so I can run some political searches to see what Google would have shown you given the information they have about you, versus what they would show to me, or Nathan, or anyone else. This kind of collaborative audit gives us the ability to get a broad-ranging view of how the system functions in the real world, track its behavior over time, and ultimately, the next time there’s some kind of crazy fake news controversy, we can look retrospectively and see: how did this happen on Google? Who was seeing it? How prevalent was it? What is going on? Thank you very much.
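To illustrate the kind of comparison this enables, here is a small sketch: given the ranked results two users received for the same query, even simple overlap metrics make personalization visible. The metrics and the example URLs are illustrative assumptions, not necessarily what the project’s pipeline uses.

```python
def jaccard(a, b):
    """Fraction of results the two users have in common, ignoring position."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

def rank_shifts(a, b):
    """Count results that appear in both lists but at different positions."""
    pos_b = {url: i for i, url in enumerate(b)}
    return sum(1 for i, url in enumerate(a) if url in pos_b and pos_b[url] != i)

# Hypothetical top-5 results two different users received for the same query:
you = ["a.com", "b.com", "c.com", "d.com", "e.com"]
me  = ["a.com", "c.com", "b.com", "f.com", "e.com"]

print(f"overlap: {jaccard(you, me):.2f}, reordered: {rank_shifts(you, me)}")
```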