Zarah Rahman: I’ll be talking about the importance of building bridges between researchers who are thinking critically about technology and data, and civil society: those working for positive social change.

So I came to this fellowship wanting to explore the idea of tech translation, a term that a colleague and friend of mine, Lucy Chambers, coined a couple of years ago. I was using it very specifically to describe an observation I kept making: that people who work in technology, and who have a really deep understanding of it, were effectively speaking almost a different language from those outside the tech sector.

I spent some time in 2015 collecting a series of case studies about the challenges civil society organizations were facing when trying to engage with technology and data. These ranged from not knowing how to manage data they were collecting about vulnerable communities, to spending resources building an app that ended up being more or less useless.

So, some context. I work for a nonprofit organization called The Engine Room. Our mission is to support civil society to use technology and data more effectively and strategically in their work. And when I say civil society I mean journalists, media activists, community organizers, nonprofit organizations. Some context about the situations in which we’re working: they’re often very resource-limited. There’s not that much money. There’s a lot of pressure. And as you can imagine right now, it feels like the problems that we’re trying to address are growing even bigger.

And over the past couple of years, it feels like civil society has been almost overwhelmed with promises of how technology can suddenly, magically solve the problems that we’re trying to address. Some coming from tech giants who say they’ve suddenly developed some semblance of a social conscience. Some from startups who see a problem and think that technology can help, without thinking about the systemic issues underlying it. And they come with money, and resources, and expertise.

Compare that to the attitudes that we have here. We are a group of people at Data & Society, academics and researchers who are thinking deeply and critically about the role that technology and data are playing in the world. Groups of academics from all sorts of disciplines are building up an ever-growing body of research around the social implications of technology. And we’re not alone. We’re learning from each other and the world around us, and we have the opportunity to spread our ideas.

Ursula K. Le Guin wrote in her book The Dispossessed, “The idea is like grass. It craves light, likes crowds, thrives on crossbreeding, and grows better for being stepped on.” The concept of ideas improving as a more diverse set of people contributes to them is not new. So then my question becomes: in terms of tech criticism, who is contributing? As far as I can tell, by and large it’s not often activists or people working explicitly for positive social change.

One potential explanation is written about in a post by Chris Olah and Shan Carter from Google Brain. They write about the idea of “research debt”: essentially, the labor that a researcher needs to do in order to start contributing to a field. They first need to understand what’s come before, climbing this mountain of previous work. Then, when they can contribute, they contribute by putting their work on top of the mountain, which makes it higher and harder to climb for those who come after. They use the term “debt” here to reflect that, like technical debt for programmers, this is a practice that seems logical in the short term but in the long term is not necessarily that logical, progressive, or useful.

I observed this debt a lot in the tech and social change space. Here, as I said, there are academics and researchers who are thinking deeply and critically about technology and data. In many civil society organizations there are people implementing tech projects, working with existing platforms, who have no idea about the critiques that exist. And it’s not that the people in those spaces aren’t critical thinkers. Working for social justice or social change is all about analyzing power and understanding people, politics, and relationships. It’s often almost the same set of skills, but there’s a complete lack of exchange between these different spaces. I think this is problematic for a number of reasons. I’m just going to go into two of them here.

Firstly, and this won’t be new to many of you, the negative effects of technology are often first felt by those on the margins of society. This has been true throughout history: colonialism, medical technologies, warfare technologies, anything. So as a result it’s crucial that those working with marginalized communities, or the members of those communities themselves, have the tools they need to be able to critically assess technologies.

One of the earliest uses of biometric technologies for identification was on Afghan refugees back in 2002. That’s fifteen years ago. They were crossing the Afghanistan/Pakistan border, and it was done by a UN refugee agency, in partnership with a tech company who was selling them this wonderful product that was going to solve all their problems. Very little attention was paid at the time to the privacy rights of the refugees in question. The technologies were used on 4.2 million people across a period of five years. Before being used on those 4.2 million people, they had been tested on 300 people.

From research carried out by Katja Jacobsen in 2010, there were many problems, as you can imagine. She documented them in a really nice research paper. This is just one quote from someone who ended up having to use and implement these technologies, saying, “Previously, you could doubt your own judgment” (before the machines), “this iris recognition will make it much better. How can they argue now? The machine can’t make a mistake.”

That initiative was probably, I have to imagine, implemented by people with the very best of intentions. They were working in the humanitarian sector. They probably went into that sector wanting to make people’s lives better. To my mind it’s really clear that they were lacking, and often still are, a critical lens through which to view the technologies that they’re being sold and asked to use. Without tools to critically assess and think about the role of technology in society, I worry that civil society will end up perpetuating inequality rather than dismantling the power structures we set out to change.

The second reason I find this lack of exchange problematic is that it’s far easier to poke holes and find problems with projects, ideas, and technologies than it is to build them. Sara Watson talks about an evolution of tech criticism that she calls “constructive tech criticism.” In her words, it goes beyond intellectual arguments. It is embodied, practical, and accessible, and it offers frameworks for living with technology.

That’s exactly what people working in civil society and for positive social change are trying to do. We’re engaging with the problems of society. We’re trying to build better futures, and not just suggest alternative possibilities but build them and make them happen. This is a massive challenge, obviously, that needs a diverse set of people and skills and perspectives in addition to activists and practitioners and organizers. It needs constructive tech critics, too.

So, how can we create these spaces for engagement? This would mean creating spaces for people from diverse backgrounds and experiences to contribute to research meaningfully. Not just being interviewees or participants in a research project, but being the people who design the project or carry out the research. But going back to what I said earlier about research debt, that path of engagement currently exists primarily by asking people who already carry a heavy burden to climb mountains rather than walk along bridges.

When people working in civil society, or people that I work with, ask me, “What should I read or look at to try and understand these critical perspectives you talk about?” I often don’t actually know what to suggest. I know that they have resource limitations and time pressure, and that it needs to be accessible. And I’m not really sure where those bridges are.
So in response to this, and in response to seeing how quickly misinformation spread in 2016, I’ve been thinking a lot about different ways to communicate these ideas. Not just in a way that tries to explain them as though I’m an expert at explaining things to laypeople, but in a way that encourages interrogation and engagement, and allows people to actually question what I’m saying. Together with my collaborator Mimi, who’s in the back, we developed this zine as a playful way of helping people think critically about the role of information in society.

We wanted it to be familiar. It’s a physical artifact. We wanted people to look at it and not think that we’re experts telling them things, but think, “Hey, I could have probably done this myself, too.” So it’s hand-written. We wanted to give context as to why this should matter, so each page has a historical example that speaks to the context of the people who are reading it. We’ve translated it into German and replaced the examples with German ones, and we’re doing the same for Spanish.

We want to keep exploring this idea, so we’ve set up a studio, Small Format, where for now at least you can buy the German and English versions of the zine. We’ll be using Small Format to continue exploring playful, critical printed interventions for engaging with data. So watch this space.

In conclusion, I believe that distilling complex ideas into their simplest forms is a task that the tech critic community could get a lot better at. We need to be able to explain complexity in simple terms, and to explain ideas not by relying on knowledge of Western philosophers or theorists but by situating them within local context and lived experience.

Last week at a conference that Ingrid hosted called Future Perfect, Ruha Benjamin talked about the value of creating knowledge not to convince people who we want to impress or persuade of a certain thing, but for ourselves and for the people who need it. I’d ask you all to think carefully about who you’re producing your knowledge for, and whether they’re the ones that could benefit the most from it. Thank you.
