For a really long time, I’ve been completely obsessed with ghost stories, because they are these fascinating cultural items which reveal, kind of as Tobias said, our anxieties and weird, interesting ways of thinking. Where the voices in the static are coming from, or why the pipes are creaking, can often tell us about these weird things.

There’s a fantastic film called The Babadook (which I really hope most of you have seen; you should go and see it) where the ghost manifests itself as grief: when a woman’s partner dies, this ghost becomes the thing through which she works out her own grief and her motherhood.

They really reveal things about ourselves that we didn’t even know were possible. And I’m really intrigued by using this as a way to explore technology, and in particular data and algorithms. One of the stories I like about this is a story that Houdini told. It’s actually a real case in Victorian spiritualism, where he put up a great big grand prize of £5,000, which is about €250,000 now, to prove the existence of spirits. There was one person who came forward who was his greatest opponent, and she was called Mina “Margery” Crandon, the Boston Medium. Through a series of really elaborate bells and whistles and levers, she tried to convince him that her dead brother was speaking through her, and Houdini being Houdini (he was a genius) just said, “No, I’m not having any of that. That’s ridiculous, because ghosts don’t exist. But I really like the fact that you’re trying so hard with this technology.” Because he used technology to make great big illusions about things himself.

An ad for Honeywell, showing a frightened man flinching away from his desk as sparks fly in from someplace out of frame, ending in a glowing envelope

Now, fast forward a hundred years to the 1950s, when we suddenly started marketing our technology as science fiction, or the future. And then thirty years later, it became magic, as Tobias says (we have some similarities; we do work together quite closely), in the case of this Honeywell ad. And as I mentioned before, data and algorithms are often seen as a kind of magic in that way. So in this case with Honeywell, a very well-recognized data stream that we all know, our email, suddenly becomes complete and utter magic. This guy is clearly really bugged out by it.

Because you don’t need to know how they work or what happens; they just do. They come to you as magic. That’s all you need to worry about. This becomes worrying when you have things like Apple’s advert which says that you are more powerful than you think. Our technology turns you into a magician. You are able to do whatever you want with this tiny computer in your hands. However, it’s actually a really powerful obfuscation technique, and it makes you think that you’re doing the magic when in fact you’re just a component in their system, one that you don’t have any ownership over. You can be part of their system, on their terms. They are the magicians; they cast the spell; they tell you how you can get involved.

Now, Bruno Latour, a science philosopher, probably one of the best-known ones, a very, very interesting French chap, developed the idea of the black box, where you can see what goes in and what comes out, but not how the decisions are made in between. These opaque processes that we can’t see remove our agency and don’t allow us any ability to see what goes on, and this can become really problematic when we tell these stories and use them to explore anxieties around algorithms and data.

In the case of Rehtaeh Parsons, a young girl who in 2013 unfortunately killed herself after a quite relentless campaign of cyberbullying by some of her classmates, her picture was used massively widely on social media. It was everywhere, because it was quite a highly-publicized death. Her family and friends, as you can see, were rightly really shocked about this, because you don’t think that your child is going to become an advert. The third-party algorithms used by Facebook’s advertisers found this image and turned it into an advert for singles in her area, because they just read it, through the metadata, as a woman 18–24, single, in this particular area of Canada.

Facebook apologized, but no one really knew who to blame. Was it the algorithm? Was it the company? Was it the programmer? Was it the person who, at the very beginning of this entire process, created that piece of code to solve a very localized problem, which was “how do I find, in this massive bank of images, a woman 18–24 who is single?”

There are more and more of these ghost stories happening on Facebook, which worry me as someone who looks a lot at this, because the people who are the most subject to them are the most vulnerable, the people who perhaps want things to be kept to their own pace. So for instance you have pregnancies that are outed on Facebook because you searched Google for “pregnancy test” or “pregnancy advice” and then the partner that you share the computer with finds out without you having the chance to tell them. Or you have a child that’s outed to their parents because they share a computer while looking for advice about their sexuality.

Facebook’s “On This Day” is a really good example of this, in some ways. It’s something that I call “means-well technology,” where a technological solution is put into a sociological and cultural friction, or messiness, essentially. All of our cultural stuff is there. And in this case, a very well-meaning service tries to make your experience in the grand vacuum of Facebook feel far more personal, but ends up alienating you because it doesn’t understand the context of the things that it throws up.

This is an example of where an algorithmic solution to the burden of information (because there’s absolutely loads of crap on Facebook; we all know that) causes social and cultural friction. You end up being haunted not just by weird things that you once posted on a friend’s page, but by this tweet:

Facebook thought that Pierce Brosnan on a horse was a meaningful memory that that person wanted to be reminded of. But you also get the slightly awful, uncomfortable weirdness of ex-lovers and dead friends, and friendships that you would rather forget about, suddenly coming up, because algorithms do not know the context of a photograph. Machine learning can tell you what it is, who it is, who it is in relation to other people, but not what it means to you. You have to contextualize these things. Algorithms don’t have our faulty methodology and contextualization, and in this way they aren’t actually very neutral. They’re very biased and prejudiced towards the people who create and offer them.

The Stone Tape theory is something that I’ve been looking at as a way of coping and dealing with potential data ghosts. The Stone Tape theory is the idea that an object or a house can be a recorder of memories or events. And in the horror genre, as in the case of this British film from the 1970s, the recording suddenly plays back at a moment of extreme emotional trigger. So grief, or the birth of someone, or a massive breakup suddenly causes these ghosts to reappear and wreak havoc through the house.

Databases are becoming like stone tapes. When the right emotional trigger hits, the poltergeist begins to take action. You don’t notice any of these algorithmic breaks in Facebook until they actually do break, and from then on you’re kind of screwed. Because you can see them, but you can’t do anything about them, because you don’t know how to. The system is so opaque that you wouldn’t even know where to start.

Which is why it’s really hard to find them or design for them, because no designer or programmer here can ever fully anticipate where their technology is going to end up. And I’m obviously not expecting you guys to completely run through every possible option. But we should have more awareness that these things exist beyond software fixes.

There’s another great ghost story (I use a lot of ghost stories; for me they’re quite a comfortable and uncomfortable way of talking about this stuff): the doppelgänger. The doppelgänger in classical mythology is an exact copy of you that you see moments before your death. And Edgar Allan Poe, as Tobias told me when we were rehearsing this earlier, wrote the most famous story of this, and you should definitely go and check it out. And in this case, if we look at the future, when we start to see ourselves and see the breakages, maybe we don’t want an assistant that we can’t control, one that signals the potential demise that we do have.

When I gave a talk very similar to this in the early days of my thinking about it, in New York [video; text], I asked: who is the exorcist in this situation? Who do we call on to get rid of the ghost? And I realized that there isn’t one, and if there is, there’s kind of no point in them being there. Because once you remove a technology which causes these problems, or if you sunset a service, the problems it caused don’t suddenly evaporate. You’re still left with all the damage. So how do we go about reducing the extent of the damage, when you’re not paying attention to the fact that your technology is entering into a system, not a vacuum? Your solution is not the only solution. It is entering into a world of other people’s solutions. It’s going to bump up against what other people think is the right thing, and you have to be aware of the fact that you are not going to be the answer to them.

Other people’s technologies happen to us. Whenever we enter into a system, we deal with everything else that people throw at us. This is a quote from a friend of mine, Deb Chachra, who reappropriated the quote that Nicolas mentioned earlier: “Any sufficiently advanced neglect is indistinguishable from malice.” Most companies, I hope, and technologists, and designers, many of you obviously in this room, don’t deliberately want to be malicious in the technology that they’re making. However, if you don’t think about the fact that your solution is not the only one, and is going to enter into a whole host of different things, then you are going to end up causing problems, and it might as well be malice. Because new mythologies are being written and summoned into reality through the dense and often really unforgiving tide of innovation. And if any of you have seen Microsoft’s product vision videos, or any of the millions of Kickstarter videos that exist, and advertising, these are the things that are telling us the future that we should have, that we deserve, that we could have, if we let the flood of innovation happen uninterrupted and uncontested and unscrutinized.

This is a still from one of Microsoft’s product vision videos. There’s a quote from Kurt Vonnegut who says, “Everything was beautiful, and nothing hurt,” or in this case, nothing breaks. And these fictions and narratives are being willed into being with really idealized users who are very easy to solve for when a problem comes up. They come from your own biases and your own experiences, which are relatively narrow, and they’re quite dumb, really. They’re imagined by people who want to do well. And we all do want to do well. I don’t think anyone in this room is…hopefully not an evil genius. If you are, where’s your white cat? And these narratives become real when we don’t pay attention to where they could potentially go wrong.

A lot of my work at FutureEverything and Changeist is about preparing for these very uncertain futures and looking at the future through the lens of art, and design, and primarily narrative. As my colleague Scott Smith mentioned when I talked to him about this before the talk, futures is the bones; it’s the skeleton of the thing. And narrative becomes the flesh. The narrative is the thing that walks about with your technology, and walks around in your technology, and has to deal with it.

As these imagined near-future fictions progress, we really need counter-narratives that push these ideals, which might actually cause people some problems, off course. We need to start breaking them, because if we don’t break them, who will? It’ll be the people who are subject to them. And threads need to come undone. As Near Future Laboratory’s Nick Foster says, we need to think about the future mundane, and the broken futures, and the people that perhaps we don’t often design for. Because when we imagine the future of a product or a service, we can totally anticipate in many ways a software bug or a hardware issue, but not where a technology that you’ve let out into the world might cause someone distress, or exclude them, or make their life harder.

The greatest example of this that I like using is that Apple’s HealthKit, in its first iteration, didn’t consider women tracking their periods an important enough metric to include. They put it in the second one. Great! Thanks, guys. But already women felt incredibly excluded from a system that was supposed to make them feel better. That’s a narrative that’s not okay. An app update is not enough. As I mentioned in the exorcist example, the damage has already been done by that point.

So starting to create these stories which are a bit off-key and broken and weird can help us become more resilient and problem-solving and far more empathetic, and think a lot more about the futures that perhaps other people could be living with our technology. I wanted to give a few examples of technology being subject to biases that we might not necessarily think of.

This is from a Scottish TV show called Burnistoun. These two guys are trying to use a voice recognition elevator, and they’re saying, in very Scottish accents, “Eleven. Eleven. Eleven.” And this lift just keeps saying, “I’m sorry, I do not recognize that command,” because it was designed by Americans, who never thought about someone with a heavy Glaswegian accent (I’m sorry to all Scottish people that I tried to replicate that), so it doesn’t work for them. You’re very much subject to these biases, and these biases lock people out of your technology because you didn’t think they would use it in that way.

Still from “Curious Rituals” by Near Future Laboratory

This is a great scene from Nicolas’ film, actually, from “Curious Rituals” by Near Future Laboratory, where he works, which I do recommend that you go and watch. The user of a smart car has to shout a name into the car phonetically, rather than the way the name is actually said, to get it to recognize a name that’s not American and not English. So she has to say “Jer-al-do” rather than Geraldo [Spanish pronunciation], which is the guy’s name.

And now back to a more contemporary ghost story. I really hope some of you have had the chance at least to probe into the weird world of Charlie Brooker’s Black Mirror. It’s kind of like a very modern Twilight Zone, in some ways. In this particular ghost story, this woman loses her husband, unfortunately, and her very, very well-meaning friend tells her that there’s this fantastic service that you can use which will literally bring him back to life using all of his data: his social media profiles, his voice calls, everything possible. Which, for a start, indicates a future where private companies have access to every single bit of your data in order to do this, which is terrifying enough.

But in this case, this service that tries to do well and tries to give you comfort in a time of incredible grief actually scares the hell out of this woman. She’s sitting on the end of a sofa here, but there’s a fantastic scene where she’s locked in the bathroom because she’s absolutely terrified of this thing that’s not her husband, not this person. It’s a manifestation of him. He is the doppelgänger. And he symbolizes this anxiety that we have around our data, where we start to see things creep out of the cracks.

Using narrative futures as I am, which is a lot of what I look at, means looking beyond trend reports, using them as an informant, and horizon scanning, which is what a lot of futurists use. And it means bringing ethnographers and anthropologists and artists and critical designers into these processes to pull them apart and break them. Because we need to create more stories about potential hauntings, where our data could create harm. Although it might not in every single instance, knowing that it could allows you to slow down and think twice.

In the early days of your technology, way back before prototyping the product, process out the kinds of futures it could have, not just the one you think it’s going to have. Even if you think it might do something weird, hand it to someone else, give it to a different, diverse group and say, “Okay, so we kind of broke it in this way, but we know that we’re not everyone. So maybe you guys have a go at it.” Think about where someone’s quality of life is compromised because you didn’t think it would ever be used in that way. There’s a really interesting example that I always use: a child in a support group says, “I think I might be gay. I want to look up information about this,” and then a few forum posts later he says, “My parents have kicked me out because they found out from their Facebook advertising that I’m gay. They’re not happy about it.” We can’t anticipate that, but knowing these kinds of narratives do take place is really important. Because we want a future where we don’t just try and design for the best possible circumstance, because realistically the world’s messy enough as it is. It’s not going to suddenly clean up over the next few app updates. That’s ridiculous.

So tell more ghost stories. Freak yourselves out a bit. Be a bit weird. Here’s Patrick Swayze. Thank you.

Still from the movie "Ghost" showing Patrick Swayze's character sitting behind his widow, as they form clay on a spinning wheel

Nicolas Nova: Thanks, Natalie. One of the questions we got was about the haunted algorithm thing. Can you elaborate on that?

Natalie Kane: The haunted algorithm?

Nova: Yeah, haunted algorithm. Can there be a haunted algorithm…

Kane: I think it’s [?] because algorithms are a very logical system. They know what’s true and false, but the problem is they have a kind of puppet master behind them who chooses the datasets that they take from. They choose to determine what’s true and what’s false. And then you think, hang on, this true and false might not be the same as someone else’s, and they create these weird gaps where they’re used by different people in different conflicting systems, and that’s where the ghosts and weird hauntings happen.

Nova: Can there be good haunted machines or algorithms? Because it’s part of everyday life to have friction, to have things that break. That could be funny, that could be original. Not all frictions are bad.

Kane: Yeah. A lot of frictions are quite amusing. There’s definitely a place for magic. As a colleague of ours, Ingrid Burrington, says, there’s a place where you can use it to explore weirdness and explore strange things and be delighted by stuff. But there’s a tension, and Tobias mentioned this, between empowerment and enchantment. Enchantment pulls the wool over your eyes, but empowerment gives you the ability to do the magic. There’s a course that Greg Borenstein runs at MIT that talks about design as using magic. I’d like to explore a little bit more what they mean by that, because I like the idea that we still have a capacity and a place for magic in the world, and to be excited and delighted by stuff. It’s just making sure we think about who casts the magic and who gets to make that magic.

Nova: Thank you very much.

Further Reference

Bio page for Natalie and session description at the Lift Conference web site.

Natalie’s page about this presentation, including the accidental inspiration for its title.