Kate Darling: You know, increasingly we’re using automated technology in ways that kind of support humans in what they’re doing rather than just having algorithms work on their own, because they’re not smart enough to do that yet or deal with unexpected situations. So we’re interacting more and more with these systems, and so you need an interface that makes sense for a human to interact with. And so a lot of these systems have an interface that is personified and that will talk to you.

The effect that I’m so interested in, the psychological effect of us treating these systems like social actors, is actually…we achieved that in the 60s. Like in the 60s we had this chatbot called ELIZA that Joseph Weizenbaum made. It’s a really famous example of a really simple chatbot. All it did was answer everything with a question, like a psychoanalyst would. So if you were like, “Oh, I don’t like my mom,” it would be like, “Why don’t you like your mom?” And people would just open up and tell it all sorts of things even though it was very primitive in how it behaved.
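[A minimal sketch of the behavior described above. Weizenbaum’s actual ELIZA used scripted keyword decomposition and reassembly rules; the single pattern and pronoun list here are purely illustrative, not the real script.]

```python
import re

# Hypothetical toy, not Weizenbaum's actual script: the real ELIZA used
# ranked keyword decomposition and reassembly rules. This only mirrors
# the "answer everything with a question" behavior described above.

# Swap first- and second-person words so the echoed text reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(text: str) -> str:
    return " ".join(REFLECTIONS.get(w, w) for w in text.lower().split())

def respond(statement: str) -> str:
    s = statement.lower().strip().rstrip(".!")
    # One illustrative pattern: "I don't <verb> <object>" becomes a question.
    m = re.match(r"(?:oh, )?i don'?t (\w+) (.+)", s)
    if m:
        return f"Why don't you {m.group(1)} {reflect(m.group(2))}?"
    # Fallback: reflect the whole statement back as a question.
    return f"Why do you say that {reflect(s)}?"

print(respond("Oh, I don't like my mom"))  # -> Why don't you like your mom?
```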

So it’s useful that we will engage with these systems on a social level, because it’s engaging for people and you can get people to actually use the systems more. But I do wonder whether there’s any effect on our behavior, on our long-term behavioral development.

It really gets interesting when we start talking about kids. You have these systems like Siri and Alexa that kids are interacting with. And if these systems are simulating lifelike behavior or a real conversation, then that could actually influence kids’ behavioral development and the way that they start to converse with other people. There are some examples of this, just anecdotally, that we have so far. Like there was an article in the New York Times a few years ago about this kid, this autistic boy, who developed a relationship with Siri. And the mom was like, this is the best thing ever. Like, Siri is infinitely patient, will answer all of his questions. Also the voice recognition is so shitty that he’s had to learn to articulate his words really well, and it’s made him a better communicator with other people because of Siri.

But then on the other hand, you have stories like this one guy who wrote an article a few months ago about how Alexa was turning his kids into assholes, because she doesn’t require any please or thank you or any of the standard politeness that you want your kids to learn in conversing with others. So you know, I think it is an interesting question. The more we’re able to simulate a real conversation with these systems, the more that might get muddled in our subconscious in some way.

One of the things that I think is really necessary is that we need to study more what impact this can have on our behavior. So we definitely need to be studying these interactions and studying the effects of these interactions. It’s a similar question to violent video games or pornography. These are questions that have come up again and again, but I think that these systems bring them to a new, more visceral level in our psychology. So we need more research, and then depending on what the answer to that specific question is, we might need to think about design, and use, and maybe even policy for regulating these agents.

Part of the problem is also that this is such an interdisciplinary problem, or all of the questions that are coming up are so interdisciplinary. You need technologists; you need policymakers; you need users; you need researchers who understand psychology, who understand technology, who understand sociology. All of this is coming together and you need people talking to each other.

I would really like us, in the next few years, to lean into the positive effects of the technology and develop structures, the way that we have in other fields like medicine, to ensure that we’re using the technology in a responsible, ethical way.

