[This presentation was in response to Tarleton Gillespie's "The Relevance of Algorithms"]

Good morning, everyone. It is a complete pleasure to be responding to Tarleton [Gillespie] today because I have found his work so generative over the last few years, and have enjoyed many conversations where we've grappled with these sorts of ideas that Tarleton has raised in his talk this morning.

And in fact, just listening to you today made me think about a whole new set of questions. In particular, the anxiety around the female body, which I think recurred three times in your talk: through abortion, through the Target pregnancy case, and of course through bikinis. I'm thinking, well, why that is such a focus of algorithmic anxiety is a whole other subject for a paper.

But in actual fact there's something else I'd like to dig deeper into today. This is your claim that we need to pay attention to the way that algorithms may produce particular kinds of political effects. But what exactly do you mean by "the political" here?

So what I would like to do, rather than walking away from the realm of theory which Tarleton offered this morning, is actually to take us right back there, but from the perspective of political theory. And I'd like to think about the logics of what Tarleton describes as these calculated publics. I'm going to do this by breaking it down into eight or so scenes (depending how we go for time) about life in calculated publics.

Scene 1

A woman is sitting in a chair with a laptop on her knees, and she's trying to buy some books for a conference that she's about to go to called "Governing Algorithms." When she tries to buy Tarleton's book, Wired Shut, she finds that "customers who bought this item also bought" James Boyle's The Public Domain, William Patry's How to Fix Copyright, and Biella Coleman's Coding Freedom.

So she starts imagining who this group of imagined shoppers might be. Are they interested in the same topics as she is? Should she buy a book about reforming copyright law, or should she buy a book which is an ethnographic account of the Debian community? I mean, they seem like fairly different topics.

So, who are these customers and what unites them in these particular tastes? I think we can imagine some of the answers, but we can't know for sure how Amazon has determined them. In fact, even senior Amazon developers may not be able to tell us exactly how these imagined communities of customers have been created, and how they've changed over time as millions of books have been purchased and millions of profiles have been updated.

Algorithms simply don't always behave in predictable ways. And this is why we have A/B testing: extensive, randomized testing to observe just how algorithms actually behave in the field with large datasets. So Tarleton argues that algorithms, and I quote,

"…not only structure our interactions with others as members of networked publics, but they also traffic in these calculated publics that they themselves produce." Thus Amazon is both invoking and claiming to know a public with which we're invited to feel an affinity, even if it has nothing whatsoever to do with the kind of public that we were originally seeking out.

So the woman at the laptop types in a different author's name, Evgeny Morozov. And she's told that "customers who bought this item also bought," amongst other things, Eric Schmidt's The New Digital Age and Kevin Kelly's What Technology Wants. Are these books similar? Well… Have people like her bought Morozov and Gillespie's books together? Not that we know. Instead we're shown a calculated public. But we don't know the membership, their concerns, or whether they loved or hated these books. There's simply a consensus: these books and people are frequently represented together.
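We can't know how Amazon actually builds these lists, of course, but the general shape of the logic is easy to imagine. Here is a minimal, purely illustrative sketch in Python of item-to-item co-purchase counting, with invented baskets standing in for real purchase histories; Amazon's actual recommender is far more elaborate and is not public.

```python
from collections import defaultdict
from itertools import combinations

# Invented purchase histories: each set is one hypothetical customer's basket.
baskets = [
    {"Wired Shut", "The Public Domain", "Coding Freedom"},
    {"Wired Shut", "How to Fix Copyright"},
    {"Wired Shut", "The Public Domain"},
    {"The New Digital Age", "What Technology Wants"},
]

# Count how often each pair of titles appears in the same basket.
co_purchases = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_purchases[(a, b)] += 1

def also_bought(title, top_n=3):
    """Titles most often bought alongside `title`, most frequent first."""
    scores = defaultdict(int)
    for (a, b), count in co_purchases.items():
        if a == title:
            scores[b] += count
        elif b == title:
            scores[a] += count
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(also_bought("Wired Shut"))
# e.g. ['The Public Domain', 'Coding Freedom', 'How to Fix Copyright']
```

Even in this toy version you can see how a "calculated public" emerges: the list tells us nothing about who these buyers are or what they thought of the books, only that the titles co-occur.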

Scene 2

Is talking about the political ramifications of algorithms enough? Or can we go a step further? McKenzie Wark argues that technology and the political are not separate things. "One is simply looking at the same system," he writes, "through different lenses when one speaks of the political or the technical."

Likewise, Alex Galloway notes that we should focus not so much on devices or platforms or algorithms and such, but more on the systems of power that they mobilize. So let's speak for a moment about algorithms as political theory, and vice versa.

Some thinkers regard algorithms as essentially autocratic systems: we have no input, they make the decisions, and we don't get to see the processes by which those decisions are made. Barbara Cassin, on the other hand, has described how algorithms like PageRank appear to have a more deliberative, democratic ethos. And I quote, "using graph theory to valorize pure heterogeneity, showing how quality is an emergent property of quantity."
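To make Cassin's contrast a little more concrete, here is a minimal sketch of the textbook PageRank calculation she's gesturing at, where a page's "quality" emerges from the quantity of links flowing toward it. The toy web, damping factor, and iteration count are standard textbook values, not anything from Google's actual ranking pipeline, which is of course far more elaborate and not public.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # A page with no outgoing links spreads its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# An entirely made-up toy web of four pages.
toy_web = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
print(pagerank(toy_web))  # "C" ends up ranked highest: most "votes" flow to it
```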

But what about alternative political frameworks to this autocracy vs. deliberative democracy? What if, say, we started to think about agonistic pluralism? Which is to say, we start with the premise of ongoing struggle (agonism) between different groups and entities (pluralism), and recognize that complex, shifting negotiations are occurring between people, institutions, and algorithms all the time, and that they're acting in relation to each other.

Scene 3

Chantal Mouffe is being interviewed for a political magazine. She's sitting on a very large, comfy chair. She's asked, "How do you define democracy, if not as a consensus?" In response she describes the difference between the model of traditional democracy and her notion of agonistic pluralism. And I quote,

I use the concept of agonistic pluralism to present a new way to think about democracy, which is different from the traditional liberal conception of democracy as a negotiation between interests. While they have many differences, Rawls and Habermas have in common the idea that the aim of a democratic society is the creation of a consensus, and that consensus is possible if people are only able to leave aside their passions and their particular interests and think like rational beings. However, while we desire an end to conflict, if we want people to be free we must always allow for the possibility that conflict may appear and to provide an arena where differences can be confronted.
Mouffe, C. (2000) The Democratic Paradox.

Scene 4

New York City. There's a group chat happening in the Reddit office. The discussion is about what they call their "hot sorting algorithm," and how posts in some areas of the site become front-page stories with relatively few upvotes.

At the same time, in Southern Nebraska, a group of fifteen-year-old girls meet at a friend's house to discuss how they could influence, or game, Reddit's system. They aim to band people together to upvote a story of homophobic harassment of a boy at their school, in order to gain media attention and shame the perpetrators.
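Part of what makes this scene plausible is that Reddit's ranking logic was, for a long time, genuinely public. Here is a sketch of the "hot" formula roughly as it appeared in the site's old open-source code (simplified here, and the live system may well have changed since): because the vote score counts only logarithmically while recency adds a steady bonus, a story with relatively few upvotes can still climb in a quiet corner of the site.

```python
from datetime import datetime, timezone
from math import log10

def hot(ups, downs, posted_at):
    """Rank a post by votes and recency, after Reddit's old open-source 'hot' sort."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    # Seconds since Reddit's launch-era epoch; newer posts get a larger bonus.
    seconds = posted_at.timestamp() - 1134028003
    return round(sign * order + seconds / 45000, 7)

now = datetime.now(timezone.utc)
print(hot(30, 2, now))      # a fresh post with modest votes...
print(hot(3000, 200, now))  # ...trails a hugely upvoted post by only ~2 points
```

That property is exactly what both groups in this scene are reasoning about, from opposite ends.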

Scene 5

Can an algorithm be agonistic? Algorithms may be rule-based mechanisms, but they're also, we could argue, governing agents that are choosing between competing and sometimes conflicting data objects.

So if algorithms present us with a new knowledge logic, as Tarleton has convinced us, then it's important to consider the contours of that logic. What are the histories and philosophies that have most strongly shaped them?

Certainly I think it's going to be difficult to describe any of these algorithms as agonistic. So much of the messy business of choosing between particular kinds of data points is essentially hidden from us, whether it's search results, which books are sold together, or which news stories are most relevant to us.

Much of the algorithmic work of picking winners in these information contests is going to remain invisible. Yet these deliberations are crucial. This is the stuff of lower-g governance: what Maurizio Lazzarato describes as "the ensemble of techniques and procedures put into place to direct the conduct of men and to take account of the probabilities of their action and relations."

Scene 6

If the politics of many of these algorithms is commonly located on a spectrum between autocracy and deliberative democracy, I think we could start to discuss the limitations of those approaches. In Mouffe's words, "when we accept that every consensus exists as a temporary result of a provisional hegemony, as a stabilization of power that always entails some form of exclusion, we can begin to envisage the nature of a democratic public sphere in a different way."

And so I think we reach her strongest argument for why thinking about agonism is important. "This is why a pluralist democracy," she writes, "needs to make room for dissent, and for the institutions through which it can be manifested. Its survival depends on collective identities forming around clearly differentiated positions, as well as on the possibility of choosing between real alternatives." And I think that's a fairly key concept here.

So this is why it matters whether algorithms can be agonistic, given their roles in governance. When the logic of algorithms is understood as autocratic, we're going to feel powerless and panicked because we can't possibly intervene. If we assume that they're deliberatively democratic, we'll assume an Internet of equal agents, rational debate, and emerging consensus positions, which probably doesn't sound like the Internet that many of us actually recognize.

So instead, perhaps if we started to think about this idea of agonistic pluralism, we might start to think about the way in which algorithms are choosing from counterposed perspectives within a field where rationality and emotion are given. As an ethos, it assumes perpetual conflict and constant contestation. And it would ideally offer a path, I think, away from these disappointingly limited calls for "transparency" in algorithms, which are ultimately kind of doomed to fail, given that companies like Facebook and Twitter are not going to give their algorithms away, for a whole host of competitive reasons, and also because they're afraid of users gaming the system.

Instead, I think to recognize the value of different perspectives and opposing interests involves an acceptance of what Howarth calls "the rules of the game," and an understanding that algorithms are participants in wider institutional and capitalist logics.

Scene 7

Where else do we find agonism in the field of algorithms? Perhaps the problem here is actually the fetishizing of algorithms themselves, without widening the perspective to include the many ways in which algorithms are not stable, and are always in relation to other people. That is, they're in flux, and they're embedded in hybrid systems.

For example, we can look to the offices and the spaces where developers are currently coming up with algorithms, and I think this is where Nick Seaver's work is really useful; he's actually spending time with the people who are designing music recommendation algorithms. We could also look at the spaces where people and algorithms are actually playing particular kinds of games. I'm thinking of Reddit here, and the fact that Reddit makes much of its algorithmic process public.

And I think people actually like the way that they can see some of the rules of the game, and can at least imagine how they might game them. It offers a kind of legitimacy, whereas I think some of these more closed, opaque systems like Facebook instead produce a kind of suspicion.

Or we could even look to the ways that people are currently reverse-engineering algorithms, where the troll and the hacker become key players in an agonistic system. So by using this wider optic, I think we can see that algorithms are always working in contested, human spaces.

Scene 8

The final word has to go to Tarleton.

In attempting to say something of substance about the way algorithms are shifting our public discourse, we must firmly resist putting the technology in the explanatory driver's seat. While recent sociological studies of the Internet have tried to undo all the simplistic technological determinism that plagued earlier work, that determinism still remains a fairly attractive analytic stance. Our analysis must not conceive of algorithms as abstract technical achievements, but must unpack the warm, human, and institutional choices that lie behind these cold mechanisms.

I take this as a useful reminder that we need to look beyond algorithms as a kind of fetish object, to consider the developers in their cubicle farms, the teenage hackers who are playing on Reddit, the Amazon book buyers, and the multitude of flesh-and-blood scenes where humans and algorithms engage.

Thanks.

Further Reference

The Governing Algorithms conference site with full schedule and downloadable discussion papers.

A special issue of the journal Science, Technology, & Human Values on Governing Algorithms was published in January 2016, including a final version of this presentation's paper.

