First I just want to say thank you to Addie for bringing us all together for the launch of Deep Lab, and thank you to Golan and Lorrie [Cranor] and Linda and Marge for getting us here and hosting us, keeping us fed and supported while we work. That’s really such a privilege.

Jen Lowe Deep Lab plan

This was my plan two days ago, and I’ll be doing something like this. First I just want to give a quick heads up that I’m going to show a video later with a little bit of violence. It’s far less than you’d see on TV, but I don’t want anybody to be taken by surprise, so just fair warning.

Almost a year ago, I put my heartbeat online, and along with my heartbeat an accounting of all the days I’ve lived, and the days I statistically have yet to live, along with my average heartbeat for each day. So I was playing with the idea of privacy. Here’s this very intimate measure, in a way. But I’m not worried about sharing it because there’s not much you can learn about me from my heart rate.
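The arithmetic behind a display like that is simple. Here is a minimal sketch (not the actual One Human Heartbeat code; the birthdate, life expectancy, and heart-rate readings below are made-up inputs):

```python
from datetime import date
from statistics import mean

def heartbeat_summary(birthdate, life_expectancy_years, todays_bpm_samples):
    """Rough numbers for a One Human Heartbeat-style display.

    birthdate             -- date of birth
    life_expectancy_years -- statistical life expectancy (e.g. from actuarial tables)
    todays_bpm_samples    -- list of heart-rate readings (beats per minute) for today
    """
    today = date.today()
    days_lived = (today - birthdate).days
    expected_days = int(life_expectancy_years * 365.25)
    # "Days remaining" is purely statistical, not a prediction about any one person.
    days_remaining = max(expected_days - days_lived, 0)
    return {
        "days_lived": days_lived,
        "days_statistically_remaining": days_remaining,
        "average_bpm_today": round(mean(todays_bpm_samples), 1),
    }

# Example with made-up values:
print(heartbeat_summary(date(1980, 1, 1), 81.2, [68, 72, 70, 75]))
```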

Screenshot of One Human Heartbeat

But it turned out I don’t know much about heart rates, and you can see starting in mid-July that my heart rate goes up and stays up. So there’s this bright red ring around, and toward the top, the bright red gets much brighter and stays bright all the way until now. That’s because of my pregnancy. Just for reference, the first arrow is around conception and then the next arrow is when I found out I was pregnant. And you can see, maybe not here but in general, that the bright red starts just before that spot.

About two years ago I gave a talk at SXSW with Molly Steenson called The New Nature vs. Nurture and I introduced the Big Data Baby, which was the first baby to be predicted by Big Data, by Target’s algorithms in particular. And in fact the Big Data Baby was born to a teenage mother. And the baby was announced to her whole family, specifically to the father of this teenage girl, by marketing materials that were sent by Target to the family’s address. Our faces are formed between two and three months, and so this Big Data Baby might’ve been identified by data before its face was even formed.

Someone else that’s working with data and pregnancy in a more detailed way is Janet Vertesi, a Sociology professor at Princeton, who did a whole experimental project where she tried to hide her pregnancy from Big Data, which mostly meant hiding it from Facebook and Amazon. She often quotes that a pregnant woman’s marketing data is worth fifteen times an average person’s data, so I’m very very valuable to marketers.

I used to talk about using Big Data to identify pregnancy, but I did this project and now it seems to me that that could become antiquated. As more people opt in to pulse-tracking and other forms of bio-tracking, companies will be able to predict pregnancy after just a few weeks, moving the prediction date even further forward, to before a woman even knows that she’s pregnant.
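To make that concrete, here is a minimal sketch of the kind of signal a tracker could look for: a resting heart rate that stays several beats above a person’s own baseline for a couple of weeks. The baseline period, window, and threshold are illustrative assumptions, not anything a company has published:

```python
def sustained_elevation(daily_resting_hr, baseline_days=60, window=14, threshold_bpm=5):
    """Return the index of the first day that begins a `window`-day run where
    resting heart rate stays at least `threshold_bpm` above the person's own
    baseline average, or None if no such run exists.

    daily_resting_hr -- list of average resting heart rates, one value per day.
    All parameter values here are illustrative assumptions.
    """
    if len(daily_resting_hr) < baseline_days + window:
        return None  # not enough data to establish a baseline plus a full window
    baseline = sum(daily_resting_hr[:baseline_days]) / baseline_days
    for start in range(baseline_days, len(daily_resting_hr) - window + 1):
        run = daily_resting_hr[start:start + window]
        if all(hr >= baseline + threshold_bpm for hr in run):
            return start  # day index where the sustained elevation begins
    return None
```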

Mike Seay OfficeMax address label

Big Data isn’t just in the business of births, it’s also tracking deaths. Last January, OfficeMax sent Mike Seay this marketing mail, a glitch that shows us he’s in the “daughter killed in a car crash” marketing container. So corporations are using data to announce births and also using data accidentally to remind us of deaths.

I’m going to talk about two future data directions that are particularly disturbing to me right now. And the first is data colonization. This is Shamina Singh from the MasterCard Center for Inclusive Growth. She’s speaking at a Data & Society event, “The Social, Cultural & Ethical Dimensions of Big Data.” It’s important that you remember that she’s representing MasterCard, and this is going to be about three minutes with some interruptions.

[Transcript of portions played follows video.]

…and the big issue for us is economic inequality, inclusive growth, the gap between the rich and the poor, financial inclusion[…]

[Jen Lowe] And here she’s about to tell us the data that MasterCard tracks for each transaction.

It takes your credit card number, your date, your time, the purchase amount, and what you purchase. So right now there are about 10 petabytes, they tell me, living in MasterCard’s data warehouse. In the Center for Inclusive Growth, we’re thinking about how do we take those analytics, how do we take all of that information, and apply it in a way that addresses these serious issues around inclusive growth? Around the world, and we all know what’s happening in Syria, we’ve all heard what’s happening in Somalia, refugees are traveling from their home countries and going to live in refugee camps in safe countries.

Have you ever thought about what it takes to move food and water and shelter from places like the United States? When everybody says we’re giving aid to the Philippines, or we’re giving aid to Syria, what does that actually mean? It means they’re taking water, they’re shipping water, they’re shipping rice, they’re shipping tents, they’re shipping all of these things from huge places around the world, usually developed countries, into developing countries, using all of that energy, all of that shipping, all the fuel, to take all of this stuff to these countries. And what happens? Usually the host country is a little pissed off, because they’ve got a bunch of people that they can’t support inside their country, and they don’t have any means to support them. So one of the things that we have been thinking about is okay, how do we use our information, how do we use our resources to solve that—to help at least address some of that? One of the answers? Digitize the food program.

So instead of buying the food from all of these countries, why not give each refugee an electronic way of paying for their food, their shelter, their water, in a regular grocery store in the home country? But the outcome is that these people, these refugees, who’ve left everything they know are showing up in a new place and have the dignity now to shop wherever MasterCard is accepted? Can you imagine?

[Jen Lowe] I just have to play that again.

…and have the dignity now to shop wherever MasterCard is accepted? Can you imagine?

[…]

Again, working with people at the bottom of the pyramid, those who have least, to make sure that we are closing the gap between the richest and the poorest.
PLENARY — “Social, Cultural & Ethical Dimensions of Big Data”

This is a worldwide plan that MasterCard has, and another worldwide plan comes from Facebook through internet.org, the idea of using drones to deliver Internet to currently-unconnected places. And Google’s Project Loon is the Facebook plan but with balloons instead of drones:

https://www.youtube.com/watch?v=m96tYpEk1Ao

Sometimes everyone isn’t really everyone, like when people say everyone’s on the Internet, because the truth is for each person who can get online there are two that can’t, and when you look closer, that everyone looks even less like anyone[…]
Introducing Project Loon

If you see a video with animated string and a little girl’s voice, that’s a sign that you might be terrified. So, really for me these three things come together—MasterCard, Facebook, Google—to represent what might be the complete remote corporate colonization of the dark places on the map that’re being wrapped up like a present.

The second data direction that’s disturbing and particularly timely in the US has to do with three things I see (I’m sure there are more) going on together with policing in the US: predictive policing, militarized police, and also something that’s very specific and new that I’m calling “protected police.” Predictive policing is actually happening in Chicago. The Chicago Police Department was funded with two million dollars by a Department of Justice grant. They used an algorithm and data to create a “heat list” of the 400 people that they predicted to be most likely to be involved in a violent crime. There’ve been Freedom of Information Act requests for that list, but those have been denied. And particularly important I think is that sixty of those 400 people have been personally visited by the police. Basically just to say, “You’re on this list and we’re keeping an eye on you.” So precrime is sort of real. And another thing that’s particularly disturbing about that story is that the researcher involved uses this kind of language. He

assures The Verge that the CPD’s predictive program isn’t taking advantage of — or unfairly profiling — any specific group. “The novelty of our approach is that we are attempting to evaluate the risk of violence in an unbiased, quantitative way.”
The minority report: Chicago’s new police computer predicts crimes, but is it racist?

Don’t ever trust “unbiased” and “quantitative” put together. So that’s happening in Chicago, and the Illinois legislature, just December 4th [2014], introduced a bill that basically makes it a felony to videotape the police, carrying a sentence of two to four years in prison. So in Chicago if you add this together what it ends up meaning is that police can use all data available to them to predict citizens’ actions, but citizens can’t collect data on the police.

So now that we’re totally depressed about data colonization and trends in data policing, I’m going to use a little Nina Simone to revive us. In case you miss it, the interviewer asks, “What’s free to you?”

[Transcript of portion played follows video.]

–Well, what’s free to you?

–What’s free to me? Same thing as to you, you tell me.

–I don’t know, you tell me. Cuz I’ve been talking for such a long—

–Just a feeling. It’s just a feeling. It’s like how do you tell somebody how it feels to be in love? How are you going to tell anybody who has not been in love how it feels to be in love, you cannot do it to save your life. You can describe things, but you can’t tell ’em. But you know it when it happens. That’s what I mean by free. I’ve had a couple of times on stage when I really felt free, and that’s something else. That’s really something else. Like all, all, like, like, I’ll tell you what freedom is to me. No fear. I mean really, no fear. If I could have that, half of my life, no fear. Lots of children have no fear. That’s the closest way, that’s the only way I can describe it. That’s not all of it. But it is something to really, really feel.

–Have you… Like, I’ve noticed…

–Like a new way of seeing. A new way of seeing something.

Still from the Nina Simone interview

So if freedom is no fear, it’s safe to say that we’re pretty far from that right now. The Pentagon’s 1033 program, which provides transfers of surplus Department of Defense military equipment to state and local police without charge, is creating this police militarization. And during this week, Ingrid Burrington has been working with the data from this 1033 program. This is a list of the most common items provided free to police in the US, and the data’s available online, if anybody feels like working with it.
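For anyone who does want to work with it, a minimal sketch of that kind of tally might look like this; the file name and the column name are assumptions, so check them against the headers of whatever 1033 export you actually download:

```python
import csv
from collections import Counter

def most_common_items(csv_path, item_column="Item Name", top_n=10):
    """Count the most frequently transferred items in a 1033-program export.

    The column name is an assumption; inspect the header row of the file
    you actually download and adjust it if needed.
    """
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row[item_column].strip()] += 1
    return counts.most_common(top_n)

# e.g. most_common_items("1033_transfers.csv")  # hypothetical file name
```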

The New York Times has a nice visualization of the location of various giveaways from this 1033 program. Here you see the 205 grenade launchers that have been given to police, and where those went:

Still from The New York Times visualization of 1033 program transfers

Grenade launchers are primarily used for smoke grenades and tear gas. And tear gas is a really personal thing for me right now because it causes miscarriage. So there’s been all these protests in New York, and I’d love to go out and take to the streets, but it’s not something that I can do right now. And I feel that very much, because I know that if I went out with my friends they might be able to protect me, but there’s no protection from something like tear gas.

I think about this Turkish woman all the time when I think about protest. And I think about the Standing Man in Turkey who started this protest by silently staring at his own flag.

And I’ve been sort of, not towards any purpose, but reading a lot about silence over the last year. This is from Sara Maitland’s A Book of Silence.

We say that silence ‘needs’—and therefore is waiting—to be broken: like a horse that must be ‘broken in’. But we are still frightened. And the impending ecological disaster deepens our fear that one day the science will not work, the language will break down and the light will go out. We are terrified of silence, so we encounter it as seldom as possible.
Sara Maitland, A Book of Silence

Ernesto Pujol, Sited Body, Public Visions, 2012

This is Ernesto Pujol’s work. He’s a performance artist who creates site-specific works that use walking, vulnerability, space, human energy, and silence as his materials. I’m going to read from his 2012 book.

The performance begins to reveal how time is an incredibly elastic construct. As the performers stood still, walked, and gestured slowly for twelve hours, their silence became an entity that heightened not just their own but everyone else’s awareness, slowing down everything and everyone, filling and sensorially expanding the dimensions of the room, its height, depth, and width. The silence emptied the room of noise, it rejected noise, filling the room itself, as tangible as liquid, as if the room was a water tank. Silence did not create a void. It had a tangible body we could cut through with a knife. Our Chicago public unexpectedly found silence again. A silence that human beings need to sustain the human condition for listening, remembering, reflecting, discerning, deciding, healing, and evolving. Silence is a human right.
Ernesto Pujol, Sited Body, Public Visions

This came out super-recently and it’s an example of silence stopping a fight on a New York subway. Watch for the snack guy:

So this guy comes in and without a word or even like a gesture or an expression, he’s just silently eating his Pringles but successfully breaking up the physical violence of this fight that’s going on.

So we’re here this week working on projects related to privacy and anonymity and the deep web, and I noticed that this weird thing happens when I talk to people about coming here. It’s that when I mention that I’m doing something about the deep web, people I know, their immediate response is like, “Isn’t that just a thing that’s for child porn and human trafficking?” And I also happen to be from Arizona. And I’ve noticed it’s almost the same sort of response. When I say I’m from Arizona people immediately say something about how racist Arizona is. And so it seems to me that these have become these sort of shadow spaces to be avoided.

And to be clear, Arizona has a lot of issues. In 2011 there was a mass shooting where eighteen people were shot and six people died. In 2002 three nursing professors were killed on the U of A campus in Tucson, actually while I was teaching. The Oklahoma City bomber developed and tested his bombs while living in Arizona. Two of the 9/11 hijackers took flight lessons in Arizona. It took Arizona voters three tries to pass Martin Luther King Day as a holiday, from ’86 to 1992, when I was a kid. The state lost hundreds of millions of dollars to boycotts. Stevie Wonder refused to perform, the Super Bowl pulled out. I remember from being a kid that the defense at the time was money, that the state couldn’t afford another state holiday. So when the holiday finally passed and they had to have it, they took away Columbus Day, which was a lovely sort of balance. And this is just the video for Public Enemy’s “By the Time I Get to Arizona” which sort of chronicled that fight to get Martin Luther King Day in Arizona.

Arizona also has a long history of racist policing, especially when it comes to border issues. Teju Cole has this very excellent Twitter essay about the Arizona/Mexico border. In 2010 Arizona passed a bill that requires immigrants to carry proof of their legal status and also requires police to determine a person’s immigration status if they have a reasonable suspicion that that person is an immigrant. So that law basically demands racial profiling from the police.

But other things grow in these shadow spaces like Arizona. The Sanctuary Movement started at the Southside Presbyterian Church in Tucson, Arizona in 1980. And it provided housing and other support for refugees from Central America, especially El Salvador and Guatemala. It expanded to include over 500 congregations nationwide. I don’t think it’s well-known but it really was the Underground Railroad of its time. A friend of mine in grade school had a Guatemalan family secretly living in his house, which he didn’t tell me until we were in college.

There aren’t many images of this because providing refuge and sanctuary are by necessity quiet acts. In researching this talk I found that there are new Sanctuary movements popping up again all over the US, again providing refuge to immigrants. No More Deaths and Humane Borders are two Southern Arizona groups that primarily care for immigrants by the simple, quiet act of putting water in the desert. Members of both groups have been prosecuted for leaving water and driving immigrants that are hurt to the hospital. So carrying water into the desert and driving dying people to the hospital become dangerous acts.

The Dark Web and anonymity on the Internet are popularly seen as shadow, criminal spaces. People say, “Why do you need to be anonymous? You have nothing to worry about if you have nothing to hide.” But I learned that in Arizona, that in the shadows and in anonymity there is power, that the Wild West is a good place for making silent but revolutionary change.

Still from Black Mirror, 15 Million Merits

This is an image from the second episode of Black Mirror, which is not me encouraging you to watch Black Mirror, necessarily. It’s…you’ll need a therapist if you are going to watch the show. But in this particular episode Bing, who’s the character pictured, ends up being richly paid to perform the voice of dissent. Two years ago I was doing this insane five-day drive across the country, and it was just me and my dog and I was moving from California to New York. And I stopped the day after Christmas in Albuquerque to meet with my friends Shaw and Rachel. Rachel and I had just spoken together at this Big Data marketing conference, and we were very much playing the role of the voice of dissent. So we were talking ethics when we were supposed to be talking money. Everyone else was talking money. And I said to them, “I feel like when we show up and we give these dissenting talks for money (we get paid for giving these) we’re like holding the glass to our throats.” I feel just like this Black Mirror episode. It’s a performance. It makes power feel like they’ve earned their badge for listening to our dissent. But what does it change? And they asked, “Jen, that’s nice, but what do you want to do instead?” And at the time, I said this:

I’m thinking about how to become more dangerous.

This seems really complicated. It seems like, how do I become more dangerous in a world where these giant corporations are using Big Data? And now I feel this sort of, how can I become more dangerous when I can’t even go out in the streets and protest, because I don’t want to be around tear gas? But I increasingly think that, for me, and maybe also for you, becoming more dangerous can mean finding a quiet gesture that helps us to create less fear and more freedom.

Erdem Gündüz

And this works in all spaces, offline and online. But online spaces strike me as these particularly rich spaces for these sort of invisible gestures. And some of the work that’s being done this week: Harlo’s Foxy Doxing project provides more information and context about abuse online; Maddy just described her Safe Selfies project, which gives individuals this aggressive power to protect their own files. Also I can think of when individuals just fill in OpenStreetMap data in areas of humanitarian crisis or natural disaster to help aid workers find their way. And related to why we’re here and the future of this grant right now, Julian Oliver and Danja Vasiliev’s work teaching people how to use Tor and other tools for anonymity.
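On the Tor side, a first step like the ones Julian and Danja teach can be as small as routing a single request through a local Tor client. A minimal sketch, assuming Tor is already running on its default SOCKS port 9050 and that requests is installed with SOCKS support (requests[socks]):

```python
import requests

# Route traffic through a locally running Tor client (default SOCKS port 9050).
# The socks5h scheme makes DNS resolution happen inside Tor as well.
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

# The Tor Project's check service should report IsTor: true when the circuit works.
resp = requests.get("https://check.torproject.org/api/ip", proxies=TOR_PROXIES, timeout=30)
print(resp.json())
```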

So as we go forward as a collective I think about how we might become more dangerous. For me becoming more dangerous means answering: how might I offer sanctuary in the midst of uncontrolled monitoring of our behavior online, of the commodification of consciousness; in the midst of data colonization and predictive policing? How can I build refuge? And becoming more dangerous means answering, in the midst of all these hostile offline and online environments: how can I carry more water? Thank you.