We’ve already been through several situations where new technologies have come along. The Industrial Revolution removed a large number of jobs that had been done by hand and replaced them with machines. But the machines had to be built, the machines had to be operated, the machines had to be maintained. And the same is true in this online environment.
What does it mean for human rights protection that we have large corporate interests—the Googles, the Facebooks of our time—that control and govern a large part of the online infrastructure?
There’s already a kind of cognitive investment that we make, you know. At a certain point, you have years of your personal history living in somebody’s cloud. And that goes beyond merely being a memory bank; it’s also a cognitive bank in some way.
Google just has to grow. It has to keep growing. But Google grows at its own peril. Google grew so much that what happened? It outgrew Google. Google had to become what? Alphabet. Now what is Alphabet? Alphabet is not Google. Alphabet is a holding company. So Google’s new business as Alphabet is to do what? It’s to buy and sell technology companies. So, once a company becomes just too big to flip anymore, it becomes a flipper of other companies.
I often try to tell people that Google is not providing information-retrieval algorithms; it’s providing advertising algorithms. And that is a very important distinction when we think about what kind of information is available in these corporate-controlled spaces.
We’re losing our ability to forget the things that should be forgotten. Wait until some of you in this room try to run for Senate or Congress, and some pictures or text roll up.
This quote’s from Andy Warhol. He was looking at America and saying America’s different. He’s saying, “Well, Elizabeth Taylor’s drinking Coke and I’m drinking Coke and the bum on the street’s drinking Coke, and it’s all the same thing.” For the first time in history, mass-market culture has allowed us all to enjoy the same thing. This is not champagne; the bum on the street can’t afford champagne.
How would we begin to look at the production of the algorithmic? Not the production of algorithms, but the production of the algorithmic as a justifiable, legitimate mechanism for knowledge production. Where is that being established and how do we examine it?