We are here to talk about fucking machines. In London, on a foggy evening, on a Tuesday, for yet another debate about fucking machines. Another curated discussion underlined by our own human insecurity about versions of us in silica. Fucking anthropomorphic fucking machines. Machines that fuck us. And let’s face it, machines are already fucking us, or so we seem to be told.
The question is what are we doing in the industry, or what is the machine learning research community doing, to combat instances of algorithmic bias? So I think there is a certain amount of good news, and it’s the good news that I wanted to focus on in my talk today.
For any artists working in this field now: if I were good at painting, I’d probably be looking at how to find styles that work well with these kinds of representations and make them easily automatable or transferable, so that if I had fans as an artist they could say, “Hey, I would like to have a picture of my cat painted.”
If you think about it, what we’re doing is turning very high-dimensional mathematical representations of a sort of large knowledge space into intellectual property. Which should be the most frightening idea in the world to anyone. This is the most abstract thing you could possibly try to turn into a capitalist object.
Computers can tell stories, but they’re always stories that humans have input into a computer, which are then just being regurgitated. They don’t make stories up on their own. They don’t really understand the stories that we tell. They’re not aware of the cultural importance of stories. They can’t watch the same movies or read the same books we do. And this seems like a huge missing gap between what computers can do and what humans can do, if you think about how important storytelling is to the human condition.
This is a moment to ask as we make the planet digital, as we totally envelop ourselves in the computing environment that we’ve been building for the last hundred years, what kind of digital planet do we want? Because we are at a point where there is no turning back, and getting to ethical decisions, values decisions, decisions about democracy, is not something we have talked about enough nor in a way that has had impact.
Victor’s sin wasn’t in being too ambitious, not necessarily in playing God. It was in failing to care for the being he created, failing to take responsibility and to provide the creature what it needed to thrive, to reach its potential, to be a positive development for society instead of a disaster.
Increasingly we’re using automated technology in ways that support humans in what they’re doing rather than just having algorithms work on their own, because they’re not smart enough yet to do that or to deal with unexpected situations.
I teach my students that design is ongoing risky decision-making. And what I mean by ongoing is that you never really get to stop questioning the assumptions that you’re making and that are underlying what it is that you’re creating—those fundamental premises.
If you have a system that can worry about stuff that you don’t have to worry about anymore, you can turn your attention to other possibly more interesting or important issues.