When you make a decision to opt for an automated process, to some extent you’re already compromising transparency by doing so. Or you could say it the other way around: it’s possible to argue that if you opt for extremely strict transparency regulation, you’re making a compromise in terms of automation.
Blockchain is in that space where we still have to explain it, because most people have gone from not having it around to having it around. But for folks who are your age or a little younger, it’s kind of always been there, at which point it doesn’t really need to be explained. It does, however, need to be contextualized.
Everybody thinks of bureaucrats as being kind of a neutral force. But I’m going to make the case that bureaucrats are in fact a very strongly negative force, and that automating the bureaucratic functions inside of our society is necessary for further human progress.
In a world of conflicting values, it’s going to be difficult to develop values for AI that are not the lowest common denominator.
We want to bring you all up to speed on some of the things we’ve been thinking about and some of the conversations we’ve been having that I’ve had to edit out of the tail ends of episodes, and to link a few concepts together… Well, first because we think it’s really important to be transparent about where we’re going with the series and the conversations we’re having.
This is why it matters whether algorithms can be agonistic, given their roles in governance. When the logic of algorithms is understood as autocratic, we’re going to feel powerless and panicked because we can’t possibly intervene. If we assume that they’re deliberatively democratic, we’ll assume an Internet of equal agents, rational debate, and emerging consensus positions, which probably doesn’t sound like the Internet that many of us actually recognize.