So, when I think of the cyber state, I might be thinking of the usual stuff, which is governments of the world and their intelligence agencies gathering information on us, directly or indirectly, through relationships with or compulsion of private companies. But I want to think more broadly about the future of the cyber state, and think about accumulations of power, both centralized and distributed, that might require transparency at boundaries we aren’t used to. Anything ranging from Google Glass, or whatever its successors might be, so that we could find ourselves livestreaming out of this room at all times and have that telemetry automatically categorized as to whom we’re having conversations with.

And imagine even the state then turning around and saying, in the event of something like the Boston Marathon bombing or the attacks in Paris, “Anybody near this event is going to have telemetry that we will now request from patriots,” or demand from those who are a little chary, and see just how much data can be accumulated from a peer-to-peer kind of perspective.

But I think it won’t just be the state getting assisted by all this; it’ll be challenged, too. This is an app called Sukey, in which demonstrators can put into the app where the police are as they’re demonstrating, and the app will then show you the still-open exits from wherever you are, the ones the police haven’t gotten around to closing off yet. And that’s the kind of thing that worries the authorities.
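The mechanics of an app like that are simple enough to sketch. Here is a minimal, hypothetical version of the idea as described, with demonstrators flagging exits the police have closed and everyone else querying what’s still open; the exit names and data structure are my assumptions, not Sukey’s actual design.

```python
# A hypothetical sketch of the Sukey idea as described above:
# demonstrators crowd-report closed exits, and the app shows
# whatever remains open. Exit names and structure are invented.

EXITS = {"north gate", "south gate", "east alley", "west road"}
blocked = set()

def report_police(exit_name):
    blocked.add(exit_name)   # a demonstrator flags a closed exit

def open_exits():
    return EXITS - blocked   # what the app would display to everyone

report_police("north gate")
report_police("east alley")
print(sorted(open_exits()))  # -> ['south gate', 'west road']
```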

But there’s a third dimension, too. And that is, how are these folks deciding to convene for a protest? It might be through something like Facebook or Twitter, or other social media. In this wonderful example, Facebook was able, in the 2010 US congressional elections, to get more people to vote by putting notifications in their news feeds of where the polling places were. Now, they could do it selectively, and they’ve got a lot of data by which to do that.

Here’s Facebook, able to predict when two Facebook users are about to enter into a romantic relationship before either of the people apparently knows. And you could imagine Facebook, hypothetically, tweaking the feeds to encourage the people to come together by showing their good sides. Or, maybe if they don’t belong together, pushing them apart, and they shall never be.

In this highly controversial study, there was, maybe not surprisingly, evidence that if your feed is filled with happy things you get happier, and if it has unhappy things you get unhappier. Which, if you think about it, especially for a younger generation getting its information from Facebook and Twitter et al., means their happiness level, whether they feel the world is falling apart or coming together, can be determined by these sorts of recondite algorithms.
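To make the recondite part concrete, here is a minimal sketch, emphatically not Facebook’s actual ranking code, of how a single knob in a feed ranker could tilt a user’s emotional diet, assuming each post arrives with a precomputed sentiment score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    sentiment: float  # assumed precomputed: -1.0 (unhappy) .. +1.0 (happy)

def rank_feed(posts, mood_bias):
    # Order posts by mood_bias * sentiment: a positive bias floats
    # happy posts to the top; a negative one floats unhappy ones.
    return sorted(posts, key=lambda p: mood_bias * p.sentiment, reverse=True)

feed = [
    Post("Ice Bucket Challenge, round 12!", 0.8),
    Post("Unrest continues downtown tonight", -0.7),
    Post("New cat photo", 0.4),
]

for post in rank_feed(feed, mood_bias=1.0):  # a "happier" feed
    print(post.text)
```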

And you could see, some day, Facebook being challenged: maybe we should put in more Ice Bucket Challenge and a little less of the stuff coming from Ferguson, Missouri if there’s unrest there, in the name of public safety. It’s worth thinking about how that should be dealt with.

Now, the data that will be used to do this isn’t always solid. Here you can see a very high correlation between suicides by hanging and the number of lawyers in North Carolina, which, as far as I can understand, are not related at all.
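For a sense of how easily that kind of coincidence shows up, here is a small illustration: any two series that merely trend the same way over time correlate strongly, whether or not they have anything to do with each other. The numbers below are invented for the sketch, not the actual North Carolina figures.

```python
# Two made-up series that both drift upward year over year. Neither
# causes the other, yet the correlation coefficient lands near 1.0.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

hangings = [420, 431, 445, 452, 460, 474, 481, 490]  # hypothetical counts
lawyers = [15100, 15400, 15800, 16050, 16400, 16700, 17000, 17350]

print(f"r = {pearson(hangings, lawyers):.3f}")  # near 1.0, no causal link
```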

And these kinds of mistakes can lead to funny things, like the Lego activity book paired with American Jihad: The Terrorists Living Among Us.

And other mistakes can happen when algorithms collide. In this wonderful example from Amazon, a regular, pedestrian book was selling for 1.7 million dollars because two sellers had gotten into an algorithmic battle, unbeknownst to each other: each had a formula calculating its price based on the other’s, and it didn’t stop until the book was listed at seven million dollars.
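The mechanics of that runaway are easy to reproduce. Here is a minimal sketch of such a repricing feedback loop; the starting prices and the two multipliers are assumptions for illustration, one seller pricing just under its rival and the other applying a markup over it, and because neither rule has a ceiling, the price compounds every time the loop runs.

```python
# Hypothetical repricing duel: seller A undercuts B slightly, seller B
# prices above A (perhaps counting on a better reputation). The two
# rules multiply to a factor above 1, so the price runs away.

price_a, price_b = 30.00, 35.00  # invented starting prices

for day in range(100):
    price_a = price_b * 0.9983   # A: just under B
    price_b = price_a * 1.2706   # B: a markup over A
    if price_b > 7_000_000:      # the runaway described in the talk
        print(f"Day {day}: the book lists at ${price_b:,.2f}")
        break
```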

Or in this case, a shirt that actually says “keep calm and rape a lot” that nobody ever bought and nobody ever sold. It was algorithmically generated: “keep calm” plus random words, all put up for sale. A million shirts on Amazon, waiting to see who might buy one, at which point it would be made.
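The shirt mill is just as easy to sketch: a template, a couple of word lists, and a product listing for every combination, none of which is printed until someone orders it. The vocabulary below is invented; the real generator’s word list was evidently far less curated.

```python
import itertools

# Hypothetical slogan mill: every template-plus-words combination
# becomes a product page; nothing physical exists until purchase.
VERBS = ["carry", "hug", "throw", "knit", "grill"]
OBJECTS = ["on", "a lot", "them all", "a torch", "an axe"]

listings = [f"Keep Calm and {verb} {obj}"
            for verb, obj in itertools.product(VERBS, OBJECTS)]

for slogan in listings[:5]:
    print(slogan)
print(f"{len(listings)} shirts listed; zero printed until someone buys.")
```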

These kinds of algorithms are going to be more and more common, so when I think of something like the European right to be forgotten, I see this eventually being all automated. The request to be forgotten will be automatically filed on your behalf; Google, or Bing, will think about it automatically; and then they’ll make a decision. So the whole line, from soup to nuts, will be done without any human intervention.
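As a thought experiment, that soup-to-nuts pipeline might look something like this; every function name, threshold, and rule here is hypothetical, meant only to make vivid what “without any human intervention” could mean on both sides.

```python
# Hypothetical end-to-end right-to-be-forgotten loop: a bot files the
# request, and the search engine's own classifier decides. No human
# reads anything at any step.

def file_forget_request(person, url):
    return {"person": person, "url": url, "claim": "irrelevant or outdated"}

def engine_decides(request, relevance_score):
    # Invented rule: delist when an automated relevance score
    # falls below a threshold.
    return "delisted" if relevance_score < 0.3 else "kept"

req = file_forget_request("A. Person", "https://example.com/old-story")
print(engine_decides(req, relevance_score=0.12))  # -> delisted
```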

Now, of course, search engines themselves may be old-fashioned. It might be more like Siri, who’s just a concierge, and will tell you before you even know to ask where you should go and what you might do. And thinking about where that advice will come from, what ingredients are going into it, and whether it’s always being given in your best interest: whether it’s advice or the temperature of your home, those are all very good questions to ask.

It really calls to mind the Harvard/Yale game of 2004, in which members of the Harvard pep squad went up and down the aisles on the Harvard side of the field distributing colored placards to hold up, like at a North Korean rally, at just the right moment, to say something to the Yalies on the other side.

Photo of the audience on one side of a football field, holding up placards spelling out "We suck."

It turns out that was the Yale pep squad in disguise, who had distributed the placards. So when they held them up, it spelled a rather unusual message, which is a great example of a miniature polity being created around a message so big and powerful that the people holding it cannot read it or understand that they’re part of it.

But the key word there is “we.” “We” is usually what we think of with a state. It’s meant to represent us. But representing us by dint of geographical happenstance, with the sovereign imposed over it: that’s not what the cyber piece is bringing us. What it’s bringing us instead is lots of different we’s, mediated in lots of ways, through lots of intermediaries, very much worthy of study, reflection, and review.

Thank you very much.