[This session consisted of Lee Konstantinou reading a short story, published on his own site as "The (Tyrannical) Lives of Algorithms." What follows is a brief discussion afterward, between him and Ed Finn.]


Ed Finn: Thank you, Lee. That was wonderful. Okay, so is it ghosts all the way down?

Lee Konstantinou: Uh…it's history all the way down, I guess was the… If that story had a thesis (I would deny it if brought into court), I would say that it's…ghosts are a figure for path dependency, for locked-in historical processes. And the previous panel I think actually, you know, the sort of irony is in fact the previous panel was quite smart on these subjects, and I think the idea that we're surrounded by these machines that we do not understand is something akin to being haunted.

Finn: Yeah, I think it's a really compelling metaphor. And it gets into a lot of the stuff that we talked about in that last panel, that we essentially create these mystical or spiritual narratives around some of these systems. And I wonder if that's inevitable or escapable. I don't know if you have a thought on that.

Konstantinou: You were mentioning in the previous panel, I think you're right to note that the logic of say, statistical analysis, or the logic of scientific inquiry, does not necessarily follow a narrative logic. And when you're narrating a story, you need actors or agents to perform actions. And our kind of rhetoric of ghosts, or the rhetoric of gods, has a very…you know, to talk about path dependency, there's a very long history of talking that way. And we inherit our language in part, and are stuck, I think, with a lot of these figures. And maybe becoming conscious about them and how they work, ripping ourselves from the familiar uses of such terms, can be part of what history does. Or learning about history does.

Finn: I was really struck as well with the notion of human drag, and the ways in which already Siri does human drag sometimes, right? When you do these jokes, or you watch the commercial where Siri's talking to Zooey Deschanel or somebody, and they're having this lively, witty conversation. And when you try and do that, it's not going to work. Unless you try really hard to summon that ghost and learn all the lines for both sides of the conversation.

But yeah, I wonder if you could reflect a little bit more on that notion of putting on a persona, that algorithms might go into human drag, but also that we are occasionally going into these sort of mixed or cyborg or computational performances as well.

Konstantinou: I don't know. I mean… You know, I'm fascinated by I guess the recent career of like, Scarlett Johansson, and the sort of casting. Some casting director somewhere is convinced that she is the ultimate figure for the posthuman, or the non-human, you know. And so there was Her, there was Under the Skin, and then that terrible but fascinating film Lucy, where she plays someone who ends up using 100% of her brain.

And so I do think there are moments when you can say things like humans are increasingly asked to behave like machines. And this was the fear with, say, or the critique of, Taylorism, right. Like these sort of management systems that force people to behave in certain ways.

But I think more frequently what we're ending up with are machines that are being designed to put us at ease, to make us relax. And to ignore them, effectively. And so for me the important thing when I was composing the story was thinking about sort of everyday life, or that level of the algorithm. A lot of the science fiction I love the most is not about these big questions. You read a book like The Diamond Age and the most interesting thing in The Diamond Age is the mediatronic chopsticks, the small detail where Stephenson says okay, well if you have nanotechnology, people are going to use this technology in the most pedestrian, kind of ordinary ways.

Finn: Yeah. It's sort of the Louis C.K. argument, right? That ten seconds after we got Internet access on airplanes, we started complaining about how terrible the Internet access on the airplanes was.

And so, right. Maybe right now we're confronting a near future where computation is becoming more and more visible, more and more present. But then at a certain point it's going to start to disappear.

Konstantinou: I mean, I think that's already… Yeah. You noted like with Siri and with other systems like this, it's already happening. And to some degree, most… Like, I give my laptop to my parents, for instance, and they're not sure what to do with it. But I give them the iPad and they seem to have a kind of intuitive sense of how to use it. And so I think it's true that increasingly the kinds of systems that will dominate our lives are the systems that are studiously kept from our view, in some ways.

Finn: Lee, thank you so much.