Archive (Page 1 of 3)

The Tyranny of Algorithms and the Use of Predictive Policing by Israel

We have been documenting and researching human rights and digital rights violations that are taking place in Palestine and Israel. And one of the most recent case studies we're looking into is the use of predictive policing by Israel, which is a rather sensitive issue given that there isn't a lot that we know about the subject.

Virtual Futures Salon: Dawn of the New Everything, with Jaron Lanier

So here’s what happened. If you tell people you’re going to have this super-open, absolutely non-commercial, money-free thing, but it has to survive in this environment that’s based on money, where it has to make money, how does anybody square that circle? How does anybody do anything? And so companies like Google that came along, in my view were backed into a corner. There was exactly one business plan available to them, which was advertising.

Loving Out Loud in a Time of Hate Speech

Dangerous speech, as opposed to hate speech, is defined basically as speech that seeks to incite violence against people. And that’s the kind of speech that I’m really concerned about right now. That’s what we’re seeing on the rise in the United States, in Europe, and elsewhere.

Sleepwalking into Surveillant Capitalism, Sliding into Authoritarianism

We have increasingly smart, surveillant persuasion architectures. Architectures aimed at persuading us to do something. At the moment it’s clicking on an ad. And that seems like a waste. We’re just clicking on an ad. You know. It’s kind of a waste of our energy. But increasingly it is going to be persuading us to support something, to think of something, to imagine something.

AI and Ethical Design

I teach my students that design is ongoing risky decision-making. And what I mean by ongoing is that you never really get to stop questioning the assumptions that you’re making and that are underlying what it is that you’re creating—those fundamental premises.

The Algorithmic Spiral of Silence

A few major platforms, like Facebook, Twitter, and YouTube, have become in many places around the world a de facto public sphere. Especially in countries that have a less than free Internet, less than free mass media. And these countries have transitioned from a very controlled public sphere to a commercially-run one like Facebook.

Malia Lazu, Black Reality 2.0: Creating and Making in the Digital Age

I became tired of knocking on the same doors and either seeing the same people or different people. But I really just felt like I was in this cycle of faux liberation, where I would feel a victory, and the victory was probably formed around the RFP for the grant that we needed to get in order to do our work.

Online Platforms as Human Rights Arbiters

What does it mean for human rights protection that we have large corporate interests—the Googles, the Facebooks of our time—that control and govern a large part of the online infrastructure?

Behind the Screen: The People and Politics of Commercial Content Moderation

When I asked my peers and my professors if they’d ever heard of this type of work, two things happened. The first thing is that they said no, they hadn’t. The second thing they said, which is probably what you’re thinking, is, “Well, can’t computers do that?” And in fact the answer to that is no.

Automation and Algorithms in the Digital Age

I want to think more broadly about the future of the cyber state, and think about accumulations of power, both centralized and distributed, that might require transparency in boundaries we wouldn’t be used to.
