How an Algorithmic World Can Be Undermined

All they have to do is write to journalists and ask questions. And what they do is they ask a journalist a question and be like, “What’s going on with this thing?” And journalists, under pressure to find stories to report, go looking around. They immediately search something in Google. And that becomes the tool of exploitation.

Algorithms of Oppression: How Search Engines Reinforce Racism

One of the things that I think is really important is that we’re paying attention to how we might be able to recuperate and recover from these kinds of practices. So rather than thinking of this as just a temporary kind of glitch, in fact I’m going to show you several of these glitches and maybe we might see a pattern.

Data & Society Databite #101: Machine Learning: What’s Fair and How Do We Decide?

The question is what are we doing in the industry, or what is the machine learning research community doing, to combat instances of algorithmic bias? So I think there is a certain amount of good news, and it’s the good news that I wanted to focus on in my talk today.

Sleepwalking into Surveillant Capitalism, Sliding into Authoritarianism

We have increasingly smart, surveillant persuasion architectures. Architectures aimed at persuading us to do something. At the moment it’s clicking on an ad. And that seems like a waste. We’re just clicking on an ad. You know. It’s kind of a waste of our energy. But increasingly it is going to be persuading us to support something, to think of something, to imagine something.

Forbidden Research: Why We Can’t Do That

Quite often when we’re asking these difficult questions, we’re asking questions where we might not even know where the line is. But in other cases, when researchers work to advance public knowledge, even on uncontroversial topics, we can still find ourselves forbidden from doing the research or disseminating the research.

Forbidden Research Welcome and Introduction: Ethan Zuckerman

As we dug into this topic, we realized research gets forbidden for all sorts of reasons. We’re going to talk about topics today that are forbidden in some sense because they’re so big, they’re so consequential, that it’s extremely difficult for anyone to think about who should actually have the right to make this decision. We’re going to talk about some topics that end up being off the table, that end up being forbidden, because they’re kind of icky. They’re really uncomfortable. And frankly, if you make it through this day without something making you uncomfortable, we did something wrong in planning this event.

Safiya Noble at Biased Data

I often try to tell people that Google is not providing information retrieval algorithms, it’s providing advertising algorithms. And that is a very important distinction when we think about what kind of information is available in these corporate-controlled spaces.

Marika Cifor at Biased Data

What I’m arguing primarily today is that focusing on pedagogy is a key aspect of social justice work, and that teaching critical data literacy along with other digital literacy skills is a key part of what we need to do.