A lot of the science fiction I love the most is not about these big questions. You read a book like The Diamond Age, and the most interesting thing in The Diamond Age is the mediatronic chopsticks, the small detail where Stephenson says, okay, if you have nanotechnology, people are going to use this technology in the most pedestrian, ordinary ways.
When I go talk about this, the thing that I tell people is that I’m not worried about algorithms taking over humanity, because they kind of suck at a lot of things, right? And, to be fair, we’re really not that good at a lot of the things we do either. But there are things that we’re good at. And so the example that I like to give is Amazon recommender systems. You all run into this on Netflix or Amazon, where they recommend stuff to you. And those algorithms are actually very similar to a lot of the sophisticated artificial intelligence we see now. It’s the same machinery underneath.
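To make that claim concrete: the core of many such recommenders is just similarity arithmetic over a ratings table. Here is a minimal sketch of user-based collaborative filtering, one common approach of this kind; the user names, item names, and ratings below are entirely made up for illustration, not Amazon's or Netflix's actual data or method.

```python
from math import sqrt

# Toy user-item ratings (1-5 scale); all names and values are hypothetical.
ratings = {
    "alice": {"book_a": 5, "book_b": 3, "book_c": 4},
    "bob":   {"book_a": 4, "book_b": 1, "book_c": 5},
    "carol": {"book_a": 1, "book_b": 5},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = sqrt(sum(u[i] ** 2 for i in shared))
    norm_v = sqrt(sum(v[i] ** 2 for i in shared))
    return dot / (norm_u * norm_v)

def recommend(user, ratings):
    """Predict ratings for items the user hasn't seen, weighting each
    neighbor's rating by how similar that neighbor is to the user."""
    scores, weights = {}, {}
    for other, theirs in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], theirs)
        for item, r in theirs.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
                weights[item] = weights.get(item, 0.0) + sim
    # Highest predicted rating first.
    return sorted(
        ((item, scores[item] / weights[item]) for item in scores),
        key=lambda pair: pair[1],
        reverse=True,
    )
```

Calling `recommend("carol", ratings)` suggests the one book Carol hasn't rated, with a predicted score blended from Alice's and Bob's ratings. The point is how mundane this is: no model of meaning, just weighted averages over past behavior.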
The Tyranny of Algorithms is obviously a polemical title to start a conversation around computation and culture. But I think that it helps us get into the cultural, the political, the legal, the ethical dimensions of code. Because we so often think of code, and code is so often constructed, in a purely technical framework, by people who see themselves as solving technical problems.
I often try to tell people that Google is not providing information retrieval algorithms; it’s providing advertising algorithms. And that is a very important distinction when we think about what kind of information is available in these corporate-controlled spaces.
This quote’s from Andy Warhol. He was looking at America and saying America’s different. He’s saying, “Well, Elizabeth Taylor’s drinking Coke and I’m drinking Coke and the bum on the street’s drinking Coke, and it’s all the same thing.” For the first time in history, mass-market culture has allowed us all to enjoy the same thing. This is not champagne. The bum on the street can’t afford champagne.
How would we begin to look at the production of the algorithmic? Not the production of algorithms, but the production of the algorithmic as a justifiable, legitimate mechanism for knowledge production. Where is that being established and how do we examine it?
It seems to me that to confront algorithms on their own terms, we may have to modify our preoccupation with the politics of knowledge and take up an interest in the politics of logistical engineering.
This is why it matters whether algorithms can be agonistic, given their roles in governance. When the logic of algorithms is understood as autocratic, we’re going to feel powerless and panicked because we can’t possibly intervene. If we assume that they’re deliberatively democratic, we’ll assume an Internet of equal agents, rational debate, and emerging consensus positions, which probably doesn’t sound like the Internet that many of us actually recognize.