Archive

Data & Society Databite #101: Machine Learning: What’s Fair and How Do We Decide?

The question is what are we doing in the industry, or what is the machine learning research community doing, to combat instances of algorithmic bias? So I think there is a certain amount of good news, and it’s the good news that I wanted to focus on in my talk today.

FollowBias: Supporting Behavior Change Toward Gender Equality on Social Media

In 2011, the cultural critic Emily Nussbaum reflected on the flowering of online feminism through new publications, social media conversations, and digital organizing. But Nussbaum worried, even if you can expand the supply of who’s writing, will that actually change the influence of women’s voices in society? What if online feminism was just an echo chamber?

Your Body is a Honeypot
Loving Out Loud When There’s No Place to Hide

We have to ask who’s creating this technology and who benefits from it. Who should have the right to collect and use information about our faces and our bodies? What are the mechanisms of control? We have government control on the one hand, capitalism on the other hand, and this murky grey zone between who’s building the technology, who’s capturing, and who’s benefiting from it.

Sleepwalking into Surveillant Capitalism, Sliding into Authoritarianism

We have increasingly smart, surveillant persuasion architectures. Architectures aimed at persuading us to do something. At the moment it’s clicking on an ad. And that seems like a waste. We’re just clicking on an ad. You know. It’s kind of a waste of our energy. But increasingly it is going to be persuading us to support something, to think of something, to imagine something.

Opal Tometi Keynote, Marching in the Arc of Justice Conference

It’s paramount that our society recognize the role of anti-black structural racism in the US. And that our 21st century multiracial social movements uplift and centralize the issues of those community members who are impacted and are living at the margins. We know that if we do, we’ll get closer to real justice for all of us. Moreover, it’s been widely documented that the gains made by and with the black community have always led to better standards of living for all of us.

Programming is Forgetting: Toward a New Hacker Ethic

I wouldn’t be surprised to find out that many of us here today like to see our work as a continuation of, say, the Tech Model Railroad Club or the Homebrew Computer Club, and certainly the terminology and the values of this conference, like open source for example, have their roots in that era. As a consequence it’s easy to interpret any criticism of the hacker ethic—which is what I’m about to do—as a kind of assault.

Marika Cifor at Biased Data

What I’m arguing primarily today is that focusing on pedagogy is a key aspect of social justice work, and that teaching critical data literacy along with other digital literacy skills is a key part of what we need to do.

Biased Data Panel Discussion

I think that we need a radical design change. If I were teaching an HCI class or a design class with you, I would say, “How are you going to design this so that not one life is lost?” What if that were the design imperative, rather than what your IPO is going to be?
