We have increasingly smart, surveillant persuasion architectures: architectures aimed at persuading us to do something. At the moment it’s persuading us to click on an ad, and that seems like a waste of our energy. But increasingly it is going to be persuading us to support something, to think of something, to imagine something.
[The] question of what happens when blackness enters the frame can kind of neatly encapsulate the ways I’ve been thinking and trying to talk about surveillance for the last few years.
The United States plants more than 170 million acres of corn and soybeans a year, more than any other country in the world. The primary mechanism the US uses to subsidize agriculture is the Federal Crop Insurance Program, which is also the largest such program globally, with over $100 billion in liabilities annually. So it’s a very big program.
There’s already a kind of cognitive investment that we make, you know. At a certain point, you have years of your personal history living in somebody’s cloud. And that goes beyond merely being a memory bank; it’s also a cognitive bank in some way.
A lot of the science fiction I love the most is not about these big questions. You read a book like The Diamond Age, and the most interesting thing in it is the mediatronic chopsticks, the small detail where Stephenson says, okay, if you have nanotechnology, people are going to use this technology in the most pedestrian, kind of ordinary ways.
When I go talk about this, the thing that I tell people is that I’m not worried about algorithms taking over humanity, because they kind of suck at a lot of things, right. And we’re really not that good at a lot of things we do either, but there are things that we’re good at. And so the example that I like to give is Amazon recommender systems. You all run into this on Netflix or Amazon, where they recommend stuff to you. And those algorithms are actually very similar to a lot of the sophisticated artificial intelligence we see now. It’s the same underneath.
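The recommender systems mentioned above can be sketched very minimally. This is an illustrative item-based collaborative-filtering example in Python; the ratings data and function names are hypothetical, not from the talk, and real systems at Netflix or Amazon scale are far more elaborate:

```python
import numpy as np

# Toy user-item rating matrix (rows: users, cols: items).
# Values are hypothetical ratings on a 1-5 scale; 0 means "not rated".
ratings = np.array([
    [5, 4, 0, 1],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def item_similarity(R):
    """Cosine similarity between item columns of the rating matrix."""
    norms = np.linalg.norm(R, axis=0)
    return (R.T @ R) / np.outer(norms, norms)

def recommend(R, user, k=1):
    """Score each unrated item by its similarity to items the user rated."""
    sim = item_similarity(R)
    scores = sim @ R[user]          # weighted sum of the user's own ratings
    scores[R[user] > 0] = -np.inf   # exclude items the user already rated
    return np.argsort(scores)[::-1][:k]

# User 0 rated items 0, 1, and 3; item 2 is the only candidate left.
print(recommend(ratings, 0, k=1))
```

The point the speaker makes holds even in this sketch: the machinery is just linear algebra over observed behavior, the same kind of computation that underlies much more sophisticated-seeming systems.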
The Tyranny of Algorithms is obviously a polemical title to start a conversation around computation and culture. But I think that it helps us get into the cultural, the political, the legal, the ethical dimensions of code. Because we so often think of code, and code is so often constructed, in a purely technical framework, by people who see themselves as solving technical problems.