Archive

Watch Your Words

The premise of our project is really that we are surrounded by machines that are reading what we write, and judging us based on whatever they think we're saying.

Compassion through Computation: Fighting Algorithmic Bias

I think the question I'm trying to formulate is, how in this world of increasing optimization, where the algorithms will be accurate… They'll increasingly be accurate. But their application could lead to discrimination. How do we stop that?

Auditing Algorithms

I consider myself to be an algorithm auditor. So what does that mean? Well, I'm inherently a suspicious person. When I start interacting with a new service, or a new app, and it appears to be doing something dynamic, I immediately begin to question what is going on inside the black box, right? What is powering these dynamics? And ultimately what is the impact of this?

Data & Society Databite #101: Machine Learning: What’s Fair and How Do We Decide?

The question is what are we doing in the industry, or what is the machine learning research community doing, to combat instances of algorithmic bias? So I think there is a certain amount of good news, and it's the good news that I wanted to focus on in my talk today.

Forbidden Research: Why We Can’t Do That

Quite often when we're asking these difficult questions, we're asking questions where we might not even know where the line is. But in other cases, when researchers work to advance public knowledge, even on uncontroversial topics, we can still find ourselves forbidden from doing the research or disseminating the research.