Archive (Page 1 of 2)

Watch Your Words

The premise of our project is really that we are surrounded by machines that are reading what we write, and judging us based on whatever they think we're saying.

Kaleidoscope: Positionality-aware Machine Learning

Positionality is the specific position or perspective that an individual takes given their past experiences and their knowledge; their worldview is shaped by positionality. It's a unique but partial view of the world. And when we're designing machines, we're embedding positionality into those machines with all of the choices we're making about what counts and what doesn't count.

AI Blindspot

AI Blindspot is a discovery process for spotting unconscious biases and structural inequalities in AI systems.

Compassion through Computation: Fighting Algorithmic Bias

I think the question I'm trying to formulate is, how, in this world of increasing optimization where the algorithms will be accurate… They'll increasingly be accurate. But their application could lead to discrimination. How do we stop that?

How an Algorithmic World Can Be Undermined

All they have to do is write to journalists and ask questions. And what they do is they ask a journalist a question and be like, "What's going on with this thing?" And journalists, under pressure to find stories to report, go looking around. They immediately search something in Google. And that becomes the tool of exploitation.

Algorithms of Oppression: How Search Engines Reinforce Racism

One of the things that I think is really important is that we're paying attention to how we might be able to recuperate and recover from these kinds of practices. So rather than thinking of this as just a temporary kind of glitch, in fact I'm going to show you several of these glitches and maybe we might see a pattern.

Biases Abound

I've experienced firsthand the challenges of trying to correct misinformation, and in part my academic research builds on that experience and tries to understand why it was that so much of what we did at Spinsanity antagonized even those people who were interested enough to go to a fact-checking web site.

The Science of Why We Deny Science and Reality

What is it about our brains that makes facts so challenging, so odd and threatening? Why do we sometimes double down on false beliefs? And maybe why do some of us do it more than others?

Data & Society Databite #101: Machine Learning: What’s Fair and How Do We Decide?

The question is what are we doing in the industry, or what is the machine learning research community doing, to combat instances of algorithmic bias? So I think there is a certain amount of good news, and it's the good news that I wanted to focus on in my talk today.

FollowBias: Supporting Behavior Change Toward Gender Equality on Social Media

In 2011, the cultural critic Emily Nussbaum reflected on the flowering of online feminism through new publications, social media conversations, and digital organizing. But Nussbaum worried, even if you can expand the supply of who's writing, will that actually change the influence of women's voices in society? What if online feminism was just an echo chamber?