Trevor Paglen: Well, thank you guys so much for being here, and thank you to the World Economic Forum for inviting me. I’m Trevor Paglen and I do lots of preposterous things. Really. I’m an artist, and you know, one of the things I really want out of art, what I see the job of the artist to be, is to try to learn how to see the historical moment that you find yourself living in, right.

I mean that very simply and I mean it very literally. How do you see the world around you? And this is harder to do than it might seem, much of the time. The world around us is a complicated place. There’s all kinds of structures and forms of power that are very much a part of our everyday lives that we rarely notice.

And one of the things I’ve been working on for fifteen years or so is looking at the world of sensing. Looking at the world of…you know, looking at the kind of planetary-scale structures that we’ve been building that facilitate telecommunications, but at the same time are also instruments of mass surveillance. It’s something I’ve taken a really close look at over the years.

When we talk about surveillance I think a lot of us have the idea, oh, there’s the security cameras and then there’s somebody standing behind all the monitors looking and seeing what’s going on. That image is over. It doesn’t work anymore. Right now the cameras themselves are doing the operations. In other words, you have a traffic camera; that camera can detect if somebody is doing something wrong and automatically issue a ticket. So we’re building these autonomous surveillance systems that actually intervene in the world.

And a lot of people are saying that by 2020 there’ll be a trillion sensors on the surface of the Earth that are able to do this kind of thing. So this is something that’s very much transforming not only the surface of the Earth but our everyday lives as well.

When we look at what these planetary infrastructures look like: on the left we see an image of what Google’s global infrastructure looks like; on the right we see the National Security Agency’s global infrastructure as of 2013. The point is, these are literally technological systems that envelop the Earth.

So one project that I’ve been doing is just trying to go to the places where these systems come together. Where does this infrastructure kind of congeal in very specific places? A really important part of global telecommunications is choke points, places where transcontinental fiber optic cables come together. What are the places where the continents are connected to one another? These are really important to telecommunications but also obviously very important to surveillance—you sit on these places, you can collect most of the data that’s going through the Earth’s telecommunications systems.

What do these look like? Well, this is a place in Long Island, one of these sites. One in northern California at Point Arena. The west coast of Hawaii. Guam is really important to this kind of thing. Marseille, in France. And what do you see in the image? Nothing, right. The point of these images is these are some of the most surveilled places on Earth. These are literally kind of like, core parts of global telecommunications and surveillance infrastructure—there’s no evidence whatsoever that that’s going on in the photos of these kinds of places. So what does that tell us, kind of allegorically, about how some of these infrastructures and systems work?

I did start pushing this a little bit further. I wanted to say, well, theoretically there should be these conjunctions of cables in these bodies of water in these images. And so I learned how to scuba dive in a swimming pool in suburban Berlin, as you do, and started going out and studying nautical charts and undersea maps to try to find places on the continental shelf where I could maybe see these.

Going out with teams of divers, when you do everything right you find images like this. As you can see, there’s dozens and dozens of Internet cables moving across the floor of the ocean.

These are cables that connect the East Coast of the United States to Europe.

Now, when we’re talking about planetary surveillance systems, AKA planetary telecommunications systems, they’re not only enveloping the Earth like in a series of cables and hardware and infrastructure. They’re also in the skies above our heads. Every minute of every hour there are hundreds and hundreds of satellites over our heads. One project I’ve been doing, again, over many many years, is trying to track and photograph all of the secret satellites in orbit around the planet, all the unacknowledged satellites.

This is done using data from amateur astronomers. Amateur astronomers go out, they see something in the sky, they look it up in a catalog, it’s not there, they know they’ve seen a secret satellite. Usually an American military or intelligence satellite. They write down what they saw. What I can do is take that observation, model the orbit, make a prediction about where something will be, and then using telescopes and kind of computer-guided mounts, I can pinpoint the place in the sky where I think it’ll be. And if you do everything right, which is rare, you get an image like this.
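A minimal sketch of that prediction step, for the technically curious, assuming the open-source skyfield library: given an orbital element set for an object and an observer’s coordinates, compute the altitude and azimuth where a telescope mount should point. The element set, site, and timestamp below are placeholder values, not Paglen’s actual toolchain.

```python
from skyfield.api import EarthSatellite, load, wgs84

ts = load.timescale()

# Two-line element set for the object (placeholder values; hobbyist
# trackers publish element sets for unlisted objects they have observed).
line1 = "1 25544U 98067A   20264.51782528 -.00002182  00000-0 -11606-4 0  2927"
line2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537"
sat = EarthSatellite(line1, line2, "UNKNOWN OBJECT", ts)

# Observer: latitude, longitude, and elevation of the telescope site.
site = wgs84.latlon(52.52, 13.40, elevation_m=35)

# Predict where the object will appear at a given instant.
t = ts.utc(2020, 9, 20, 4, 30, 0)
alt, az, distance = (sat - site).at(t).altaz()

print(f"altitude {alt.degrees:.1f} deg, azimuth {az.degrees:.1f} deg, "
      f"range {distance.km:.0f} km")
```

If the predicted altitude is above the horizon during twilight, you point the mount there and wait for the streak.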

And this line here is the streak of something called the X‑37B, for example. This is an American secret space drone that’s currently on its fourth mission. The X‑37B.

So I get into the culture a little in these things. This is from the crew patch of the guys that fly this thing. And this is the program office that controls it, an outfit called the Rapid Capabilities Office, who have this motto here in Latin, Opus Dei blah blah blah, “doing God’s work with other people’s money.” So this is kind of a glimpse into the culture of this kind of stuff.

So the point is like, we have surveillance systems that exist at the scale of the planet, that literally envelop the surface of the Earth and literally envelop the heavens above the Earth. But these scale down in various ways. These are also articulated of course at the scales of cities, down to the scales of living rooms, down to the scales of our bodies, down to the scales even of our thoughts and the questions that we ask.

One of the things I’ve been developing in the studio for the last couple years is a set of tools that allow us to make images that show us what a computer vision system might see as it looks at the world, what a neural network might see as it looks at the world. In other words, tools that make images showing us what autonomous sensing systems are actually perceiving when they look out at the world.

This, for example, is an image of the US/Mexico border. And for those of you that don’t know this already, there is a wall already. And what we’re seeing is the border and, overlaid on top of the border, a vision of what the border looks like as seen through computer vision systems that are used to detect motion, detect anomalies. You know, we’re seeing the border as well through these systems that surveil it.

A phenomenon that’s been going on for a while is ALPR, automatic license plate recognition. These are systems that take pictures of every single car that drives by on a city street, are able to autonomously read that car’s plate number, and either put that in a database that the police or law enforcement have access to or, again, issue things like traffic citations based on it. All without human intervention.
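A toy sketch of the two stages such a system chains together, plate localization and character reading, using OpenCV’s bundled Haar cascade and the Tesseract OCR engine. Deployed ALPR systems use trained neural detectors and far more robust pipelines; the input filename here is a placeholder.

```python
import cv2
import pytesseract  # requires the Tesseract OCR engine to be installed

# Stage 1: detect plate-shaped regions with a cascade that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_russian_plate_number.xml")
frame = cv2.imread("frame.jpg")  # placeholder: one frame from a street camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
plates = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)

# Stage 2: crop each candidate region and read the characters.
for (x, y, w, h) in plates:
    crop = gray[y:y + h, x:x + w]
    text = pytesseract.image_to_string(crop, config="--psm 7").strip()
    if text:
        print("plate candidate:", text)  # e.g. hand off to a database lookup
```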

The same thing is starting to come online with police body cameras, which are now being outfitted with facial recognition technology to do something similar.

One of the tools that we have in the studio is the ability to make portraits of what people look like as they are seen by facial recognition software. And we’ve been running these on portraits of revolutionaries and philosophers from the past. On the left is the great post-colonial philosopher Frantz Fanon as seen through facial recognition software. On the right is Simone Weil.
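A minimal sketch of that kind of portrait, assuming the open-source face_recognition library (dlib under the hood): load an image, extract the landmark points the software measures, and trace them over the photograph. The filenames are placeholders; this illustrates the idea, not the studio’s actual tooling.

```python
import face_recognition
from PIL import Image, ImageDraw

# Load a portrait and ask the library for the landmarks it measures.
image = face_recognition.load_image_file("portrait.jpg")  # placeholder file
faces = face_recognition.face_landmarks(image)

canvas = Image.fromarray(image)
draw = ImageDraw.Draw(canvas)

# Trace each measured feature: chin line, eyebrows, nose bridge, eyes, lips.
for face in faces:
    for feature, points in face.items():
        draw.line(points, fill=(255, 255, 255), width=2)

canvas.save("portrait_as_seen.png")
```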

This is also obviously happening in the commercial space. You go to a modern supermarket, there are autonomous systems identifying you, trying to understand when the last time you were there was, how much money you spent. What are you looking at? What’s your emotional state? What’re you interested in?

And they’re getting more and more intimate. Sensing systems looking at: what kind of food do we eat? Are we going to the gym? You know, are we in good health? How are we behaving? Are we drinking too much? Do we smoke cigarettes? What kind of objects are in our houses, and what does that say about who we are?

We’re at a point now where Google or Facebook or Amazon literally knows more about me and my history than I know about myself. And what are some of the implications of that? How do we think through that? What does that mean? What do we see when we actually look through these kinds of sensing systems?

Well, one thing that I think becomes very obvious when you spend some time with it: there’s a kind of popular idea out there that, oh, technology is neutral, it’s just how you use it. And I want to counter that. I want to say there’s no such thing as technology detached from how you use it. And so when you deploy these kinds of systems, any kind of sensing technology sees through the eyes of the forms of power that it’s designed to amplify. The forms of power that it’s designed to exercise, whether that is military power, or law enforcement, or commercial power, etc. I think that’s one thing that you start to see.

And this brings up a lot of concerns for me. I worry about what the future of these kinds of planetary autonomous sensing systems is. I worry that they have a tendency to kind of reproduce the kinds of racism, and patriarchy, and inequality that’ve characterized so much of human history. I’m also concerned that they represent enormous concentrations of power in very few places.

I said at the beginning of the talk that one of the things I want out of art is things that help us see the historical moment that we live in. How do we learn to see the world? But there’s something else that I want out of art as well. I want something that helps us see a world that we want to live in. And if you want to see that world you have to ask yourself what do you want? And so I spend a lot of time looking at these technologies and asking myself how would I want them to be different? What world do I want to live in?

I want to live in a world in which artificial intelligence has become decoupled from its military histories, from its commercial histories, from its law enforcement applications, and so on and so forth. So I started building very irrational neural networks. Neural networks that, instead of analyzing your face and detecting what emotion you might be in, look around at the world and see literature. In this case this is an AI that sees omens and portents, and this is an image it synthesized of a comet, or a rainbow.

This is an image that it synthesized of a vampire. This was an AI that was trained to see monsters that have historically been allegories for capitalism.

Histories of warfare or fantastic images.

I want an Internet…I want global telecommunications systems that are not the greatest instruments of mass surveillance in the history of the world. What would that look like? So I started building communications hardware for museums in the form of sculptures that create open WiFi networks. But instead of tracking all the data of all the people that connect to them, they do the exact opposite. They anonymize the identities of people connecting to the network and encrypt all the traffic, so that nobody can see what people are doing when they’re using the Internet in a museum.

I want space flight that’s decoupled from the histories of nuclear war and the military histories associated with it, even the kind of colonial imaginations associated with it. So I started building a satellite. This is a project called Orbital Reflector that’s due to launch this summer, between July and August. We just got our official launch letter yesterday. This was commissioned by the Nevada Museum of Art. And all it is is a satellite that goes up into space and deploys a giant mirror about thirty meters long, a hundred feet. What that mirror does is reflect sunlight down to Earth to create a new star in the sky. That lasts about two months and then it burns up harmlessly.

And so this question of what kind of world do we want to live in. Well, I want to live in a world that has more justice, that has more equality, and that has more beauty in it. And that is kind of why I try to do these preposterous things. Thank you guys so much for coming.


Nicholas Thompson: So I will start with one thing that struck me while I was watching. When do you strive for beauty in your art, and when do you strive for banality?

Trevor Paglen: Yeah. That’s a really good point, that’s a really good point. You know, when I’m making art, this is… I almost want to find out what something else wants to look like. And I know that sounds really mystical, but usually the process that I have is I look at something and then I look at it again, I look at it again, I look at it again. And I wait for a kind of aesthetic language to emerge, almost where it finds a form that it wants to be.

And I don’t worry about that contradiction between beauty or, you know… You know, people criticize me sometimes, saying, “Trevor, you take beautiful pictures of bad things,” right. And my counter to that is, you know, I’m not sure that that’s how beauty works. I mean, I think we would all love to live in a world in which beautiful things were good and ugly things were bad. But I can tell you, if you go out with a telescope and you look at the night sky and you’re looking at the wonder of the cosmos, and then you see the tiny glint of a spy satellite flying through it, you will see nothing more beautiful in your life. Even though the politics of that are something we could have a debate about. And so those kinds of contradictions, I guess, are mine to inhabit.

Thompson: You stare at the cable or you stare at the surveillance station and then you think, “Does this thing want to be beautiful? Does this thing want to be…?”

Paglen: It’s not that conscious, you know. You just work with the materials until you find something that speaks to you.

Thompson: Right. Okay. We have some excellent questions here. First one from Allison Martin: “When can surveillance be good for us? Crisis response, for instance. How do we balance the good with the less good?”

Paglen: Yeah…

Thompson: So if there’s a disaster, we want to know where the building fell and find the people.

Paglen: No, absolutely. No, absolutely. I mean, you know, so I think of course— I mean, I think this is exactly the question I’m trying to ask, right, in the sense of: we have these technologies. Technologies, I will insist, kind of amplify the forms of power that they’re deployed to exercise. So what places in society do we want to optimize? What places do we not want to optimize? I would argue we want to optimize energy efficiency, for example. It seems nobody wants to argue with that; that’ll help the planet. We can do that.

Do we want to optimize for…extracting money out of people? If you do that, then you’re going to produce a system that does that. And so I think for me that’s the big kind of question.

Thompson: I believe some of those companies might be here in Davos.

Paglen: Yeah, that’s the whole entire premise—

Thompson: It’s a very successful business.

Paglen: Yeah, no. I mean, that’s the premise of software platforms. That’s the premise of a Google or a Facebook.

Thompson: But let’s hang with the question a little bit longer. As you think about surveillance technology developing, how would you want to push it in a way that it ends up being used more for good causes than for causes you see as bad?

Paglen: Yeah. You know, I think—

Thompson: How do you push tech— I mean, one way to push technology is to show art of it and have us think about it and have conversations.

Paglen: Of course.

Thompson: But how else do you try to push technology?

Paglen: You know, I think it really does come down to this question of what world do I want to live in, right? And when I think about, like, the Internet, for example. I love the Internet. I use the Internet all the time. I want an Internet that’s, on one hand, like a library. Libraries are very important institutions to democratic societies for two reasons. One, you can look up anything that you want and get any kind of information you want. Two, equally important, the police don’t get a record of the books that you checked out, right. And that’s not…you know, Google doesn’t get a record of…whoever.

And so I think about, like, those are the kinds of institutions that I want. And so I guess I think the collective thing that we should all be doing as we are building this technosphere or whatever it is, is asking ourselves: what do we want the world to look like? And where do we want to apply these kinds of technologies and sensing systems? And what kinds of places do we want to exempt from them? What are places in society that we actually don’t want to optimize?

Thompson: This is a question that ties to that. We have a question now that came in through that feed from Zimbabwe: “Don’t you think that surveillance actually assists the way of life for us, although it inconveniences a few?” And there is an argument—we were talking about this before, that actually getting lots of data on lots of people helps bring people into the financial system. It helps lots of people, particularly in developing countries. Is there…is surveillance something we worry about too much in the West, but actually it’s beneficial for the world as a whole?

Paglen: Well, let’s think about what the implications of that are, right. So, today the most obvious use of data collection is like, selling us stuff. So maybe they want to sell you Crocs and sell me like, motorcycle boots or whatever it is. And what have you. And we can argue about whether we like that or not. But as business models evolve, the point of any business, and you’d be negligent not to do that, would be to extract as much money out of the data that you’re collecting as possible, in a kind of heavily capitalist system like the US.

And so what am I going to do with that data? Well, I’m going to figure out: what’s your lifestyle? Are you a healthy person or not? Then we’ll sell that information to your health insurance company, and they might say, “Oh, you drank a Coke this week. That’s gonna cost you an extra five bucks on your health insurance premium.” And you can think about what the logical implications of that are. If you play that out, you are creating a society whereby inequalities are going to become much more acute, on one hand. And you’re going to create a society that ultimately is a lot more conformist, because there’ll be serious consequences for doing stuff like I did as a teenager. Like, you screw up and do weird stuff.

So you’re going to fundamentally change what the culture is as well. And I worry that by not exempting parts of everyday life from surveillance or data collection, the consequences are not going to really be a world that we all feel perfectly free in, honestly.

Thompson: One of the details I love about Trevor’s life is that he lives in Berlin next to the old Stasi offices.

Paglen: It’s true.

Thompson: We have another question, this one from Anonymous: “What is the one technology that worries you the most?”

Paglen: The one technology that worries me the most. I mean, I think right now it’s probably like a combination of artificial intelligence plus capitalism, or plus state power. And what I mean by that is the ability to do stuff with massive data sets, right. So in the case of the US it’s what we talked about with credit or health insurance, or these kinds of ways in which our everyday lives, and our de facto liberties, are actually modulated, and whereby companies are incentivized to draw as much money out of us as possible. Or you have a society like the system China’s setting up now, this kind of social credit score system where the state becomes an institution that is using similar tools in order to regulate the kind of docility or conformity of a population.

Thompson: Is that what worries you the most? The sort of… The system in China, which was the cover story in the last issue of Wired magazine, if anybody’s not familiar with it. PSA. Is that the…if you were to say, of all the major projects in the world, is it the surveillance system, the social credit score being built in China, the one that worries you the most?

Paglen: I think to me the social credit score system is the clearest distillation of what is happening in all surveillance platforms in general. So for those of you that are not familiar, the Chinese social credit score system is a system that surveils everything that you do and gives you kind of good points if you’re kind of obedient and say nice things about the state and you pay your taxes, you go to work on time, whatever. A range—

Thompson: You lose five points if you interview Trevor Paglen on a stage…

Paglen: You lose social credit points—you know. And this has implications for what state services you’re able to access, what schools you’re able to go to, whether you can get a visa or not. You can get travel enhancements if you’re good, and you can get travel restrictions if you’re bad. And so it’s a very obvious way in which you can see a society that has a kind of a strong centralized state wanting to maintain a population a certain way, using these kinds of tools to do that.

Thompson: [indicating his event badge] It’s kind of like this, except the color keeps changing based on what you do here at the World Economic Forum.

Let me ask you about how, as you talk about the move towards AI, how that affects your art. Because if your art is photographing specific things, that’s relatively discrete, right. You can photograph a satellite, a station, a cable. How do you deal with this new world we’re going into, where everything is going to be inside of code? How will that affect your career as an artist?

Paglen: Yeah, it’s been really interesting. The last couple of years I’ve been working a lot with AI and computer vision, and trying to— It’s almost like going inside the cables themselves and trying to see what world lives in there. And obviously it’s a world that has no analogue to our lived experience, our everyday life. Even to the point where you can’t really understand what’s going on in a neural network if you’re building it and you’re a computer scientist. I mean, they’re kind of famously inscrutable to humans.

So, I think the way that I have been trying to deal with that question in particular is you almost have to make a literary turn. You know, finding metaphors, finding allegories. That’s kind of what I was doing with the AI, where I was training it to see images from Freud or Dante, or taxonomies of things that I would make up. And you try to find allegories.

Thompson: That’s a way of seeing a new kind of AI. But the inner AI of some of the surveillance companies—how do you explain how those work through art?

Paglen: I’m not sure that that’s really what my job is as an artist, you know. I really think that my job isn’t to explain to you so much as it is to create a series of images that you can use as kind of a vocabulary, a kind of cultural vocabulary, with which to try to have a conversation. So I think it’s an adjacent project to something like journalism or criticism or something like that. And in order to understand the world, we need language, we have logic, we make arguments on one hand, we have evidence and data. But we also experience the world through images, through culture, through music. And we kind of put all these together to form a worldview.

Thompson: And so why and how did you make this your job? How did you decide on explaining surveillance to the people through art? Because it’s not like… You probably didn’t study that in college.

Paglen: Well, I did study that in college. I have degrees in art and I have a degree in geography as well. And I think that doing this kind of work is…you just do it or you don’t. I mean, I really feel like in terms of doing art, you either do it or you don’t, you know.

Thompson: But you start with one project on surveillance. When did you decide that was going to be the thing that you would devote your life to?

Paglen: Yeah, I mean, one project always turns into another one. So in the aughts I was doing a lot of work around the CIA and, you know, the war on terror, and trying to understand the visual landscape of that, as well as what’s the relationship between secrecy and invisibility in that. And then as a result of that work I was brought on to do some work around the Edward Snowden project and the Citizenfour film, which turned into a project about the NSA and global infrastructure. Then at some point you look back and say, okay, this is what the NSA looks like, but wait a minute. There’s this bigger thing called Google; what’s going on there? And so I think one project just always turns into another.

Thompson: So we have another question that’s been upvoted: “What kind of technology do you use or avoid using as a private individual?”

Paglen: No, it’s a really good question. So I use the same technology—I’m on Twitter. I’ve got the smartphone. There are… A lot of us imagine, you know— One of my pet peeves is when people say, “Well, we just give all our information to Google,” or, “We just give all our information to Facebook. We just give all our information to Apple by having smartphones.”

These are not systems that you can opt out of, right? I mean, in practical life, if you want to have a job, you have to have a smartphone. If you want to stay connected with your friends, you’ve got to have a Facebook account, etc. So I don’t think that submitting to these kinds of systems, or having these systems be a part of your everyday life, is something that you can opt out of as easily as— I don’t actually think we have much choice in that.

Having said that, if I’m doing something like Googling questions I might have about my own personal health or about things like that, I’m going to use tools that encrypt that. So I’ll use Tor, for example, to do that. I’ll use Signal to communicate with other people. Just because I don’t necessarily want that kind of thing to be in my metadata signature. So I think a lot about what that metadata signature looks like, and what I’m not so worried about, and things that I think down the line might have unintended consequences.

Thompson: Right. One of my favorite tools to delimit that is software that will Google thousands and thousands of random phrases just to confuse your metadata signal.
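A minimal sketch of that chaff idea, in the spirit of tools like TrackMeNot: periodically issue searches for random phrases so that real queries are buried in noise. The word list, endpoint, and pacing below are illustrative placeholders, not any particular product’s behavior.

```python
import random
import time

import requests

WORDS = ["harbor", "violet", "induction", "pamphlet", "glacier",
         "satellite", "sourdough", "meridian", "tangent", "lantern"]

def decoy_query():
    """Send one search for a random phrase and return the phrase."""
    phrase = " ".join(random.sample(WORDS, k=random.randint(1, 3)))
    # Placeholder endpoint; a real tool would rotate engines and headers.
    requests.get("https://duckduckgo.com/html/",
                 params={"q": phrase}, timeout=10)
    return phrase

if __name__ == "__main__":
    while True:
        print("decoy:", decoy_query())
        time.sleep(random.uniform(30, 300))  # irregular pacing, less robotic
```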

Here’s a pointed question. “Have you considered using your work to do good? Solve a crime and democratize access to technology, so it’s not only for those who can afford them.” So I think that’s sort of the software you’re creating, the tools you’re creating.

Paglen: No, absolutely. So, I think in terms of doing good, we always operate at the scale that we’re able to operate at, and using the institutions that we do. So I’m an artist. I work with museums. That’s a place where I can make an intervention. So the software that we use is all open source, and we publish the code and everything like that. But you operate within your sphere of influence, you know.

Thompson: I think you’re doing good. We only have about two minutes, so I want to ask you: you’re about to launch a satellite!

Paglen: Yes.

Thompson: I think you might be the only guy in this room launching a satellite, and we kinda just breezed past that. So please explain why you’re launching a satellite, what the satellite is going to do, and when does it go.

Paglen: Yeah. So the satellite is a project commissioned by the Nevada Museum of Art. It’s conceived of as an Earthwork. And Earthworks are traditionally big art projects that are out in the middle of the desert. You drive out there, you see it, it’s a big artwork. And the Nevada Museum of Art, because there’s a lot of desert there, they commission a lot of these things.

So this is commissioned as an Earthwork. It’s a small satellite. It piggybacks, actually, on a spy satellite from Vandenberg Air Force Base in California. It goes into orbit, and then once it’s in a 575-kilometer orbit, it opens up and inflates a giant mirror, basically, in the shape of a diamond, about 100 feet long. And then as that mirror kind of goes through space it catches sunlight and reflects it down to Earth. And there’s a window, for a couple of hours after twilight and before dawn: if you look up in the sky you’ll see the sunlight reflected off of this, and it will look like a star about as bright as one of the stars in the Big Dipper. And so there’ll be a web site and an app where you can say, “I want to see this project,” and it’ll say, “Oh, you’re in San Francisco. Go out at 8:15 tonight and look at this constellation; here’s the extra star.”

Thompson: So the point is just for us to be able to see it?

Paglen: That’s the entire point. So the point of this satellite was to build a satellite that, as much as possible, had no military, scientific, or commercial function, right. So it was trying to build a satellite that, as much as you possibly could, was just an aesthetic object, and what’s more, to kind of mobilize the ingenuity of people in the aerospace industry towards that goal. And so in that sense I think of it as a preposterous object, or an impossible object. And I’m trying to make an object that kind of contradicts the logic of every other satellite that’s ever been launched.

Thompson: Do you worry that you’re going to…contradict some of the wonder we have when we look up at the sky? I think about my kids looking up at the sky, and it’s like, you’re seeing stars but you’re also seeing Trevor Paglen’s Earth project…

Paglen: Well, I think the bet is that having Trevor Paglen’s art project in the sky gives you an excuse to take your kids out there and look at the sky.

Thompson: There you go. That’s even better. Alright, thank you very much. I’ve got to—speaking of surveillance, go to three events hosted by Facebook. Thank you, Trevor Paglen. I think he’s awesome. I think his work is amazing. Thank you for being here. I very much enjoyed that conversation. Thank you for all the questions.