You’re not going to get a generation of people outraged that somebody’s reading their email the way you would have, in the 70s, gotten a generation of people outraged that somebody was reading their snail mail.
What does it mean for human rights protection that we have large corporate interests—the Googles, the Facebooks of our time—that control and govern a large part of the online infrastructure?
Are there any limits to the connected workplace? Are there any concerns about the connected workplace? Is there any way in which you wouldn’t want either yourself or an employee to be connected? Are there any limits to the kinds of information we can gather in order to make our workforces more productive? In order to make our overall society more productive?
The Tyranny of Algorithms is obviously a polemical title to start a conversation around computation and culture. But I think that it helps us get into the cultural, the political, the legal, the ethical dimensions of code. Because we so often think of code, and code is so often constructed, in a purely technical framework, by people who see themselves as solving technical problems.
We know very little about complex financial systems and how systemic risk, as it’s called, is computed and how you would manage policies. And if you look back at the financial crisis, you can either say, as many economists do, “It all had to do with badly-designed rules,” which may be part of the story; it’s certainly part of the story. Or it may have to do with the interaction of those rules and human nature, like mortgage broker greed, optimism… And you see it not just in individuals who now have houses in foreclosure, but at the highest levels.
We all see the benefits of active safety systems in cars. But that same safety technology, if attacked, can actually allow you to immobilize a vehicle or even disable the brakes while driving.
Imagine your privacy assistant is a computer program that’s running on your smartphone or your smartwatch. Your privacy assistant listens for privacy policies that are being broadcast over a digital stream. We are building standard formats for these privacy policies so that all sensors will speak the same language that your personal privacy assistant will be able to understand.
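The mechanism described above can be sketched in a few lines. This is a minimal illustration, not the project’s actual implementation: the JSON field names, the sensor name, and the preference scheme are all hypothetical stand-ins for whatever standard format the researchers are building.

```python
import json

# A sensor's broadcast privacy policy in a hypothetical standard format.
# Field names here are illustrative assumptions, not a published schema.
BROADCAST = json.dumps({
    "sensor": "lobby-camera-3",
    "data_collected": ["video"],
    "retention_days": 30,
    "opt_out_supported": True,
})

# The user's preferences, as a privacy assistant might store them:
# ask to be notified whenever video is being collected nearby.
PREFERENCES = {"video": "notify"}


def handle_broadcast(message: str, prefs: dict) -> list:
    """Parse one broadcast policy; return alerts the assistant should raise."""
    policy = json.loads(message)
    alerts = []
    for data_type in policy["data_collected"]:
        if prefs.get(data_type) == "notify":
            alerts.append(
                f"{policy['sensor']} collects {data_type} "
                f"(retained {policy['retention_days']} days)"
            )
    return alerts


print(handle_broadcast(BROADCAST, PREFERENCES))
```

Because every sensor speaks the same machine-readable language, the assistant can make these decisions automatically instead of asking the user to read each policy.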
We have to be aware that when you create magic or occult things, when they go wrong they become horror. Because we create technologies to soothe our cultural and social anxieties, in a way. We create these things because we’re worried about security, we’re worried about climate change, we’re worried about threat of terrorism. Whatever it is. And these devices provide a kind of stopgap for helping us feel safe or protected or whatever.