Archive

danah boyd: Algorithmic Accountability and Transparency

In the next ten years we will see data-driven technologies reconfigure systems in many different sectors, from autonomous vehicles to personalized learning, predictive policing, and precision medicine. While the advances that we will see will create phenomenal new opportunities, they will also create new challenges—and new worries—and it behooves us to start grappling with these issues now so that we can build healthy sociotechnical systems.

How an Algorithmic World Can Be Undermined

All they have to do is write to journalists and ask questions. What they do is ask a journalist a question like, “What’s going on with this thing?” And journalists, under pressure to find stories to report, go looking around. They immediately search something in Google. And that becomes the tool of exploitation.

You Are Not a Digital Native (and that’s OK)

You may have heard people come up to you and say like, “Hey, you’re young. That makes you a digital native.” Something about being born after the millennium or born after 1995 or whatever, that makes you sort of mystically tuned in to what the Internet is for, and anything that you do on the Internet must be what the Internet is actually for. And I’m here to tell you that you’re not a digital native. That you’re just someone who uses computers, and you’re no better and no worse than the rest of us at using computers.

Social and Ethical Challenges of AI

One of the challenges of building new technologies is that we often want them to solve things that have been very socially difficult to solve. Things that we don’t have answers to, problems that we don’t know how best to approach in a socially responsible way.

Malia Lazu, Black Reality 2.0: Creating and Making in the Digital Age

I became tired of knocking on the same doors and either seeing the same people or different people. But I really just felt like I was in this cycle of faux liberation, where I would feel a victory, and the victory was probably formed around the RFP for the grant that we needed to get in order to do our work.

Data & Society Databite #41: Ifeoma Ajunwa on Genetic Coercion

The mythology of genetic coercion is the belief that genetic data, especially large-scale genetic databases, have the ability to pinpoint certain risks of disease. They provide the agency to act to prevent such disease, they can be used to create accurate personalized treatment for disease, and they should also be entrusted with the authority to dictate the modification of the genome for future generations.

Biased Data Panel Discussion

I think that we need a radical design change. And if I were teaching an HCI class or design class with you, I might ask, “How are you going to design this so that not one life is lost?” What if that were the design imperative rather than what your IPO is going to be?