Archive (Page 1 of 3)

The True Costs of Misinformation
Producing Moral and Technical Order in a Time of Pandemonium

Of course we’re avid, avid watchers of Tucker Carlson. But insofar as he’s like the shit filter: if things make it as far as Tucker Carlson, then there’s probably much more like…stuff that we can look at online. And so sometimes he’ll start talking about something and we don’t really understand where it came from, and then when we go back online we can find that there’s quite a bit of discourse about “wouldn’t it be funny if people believed this about antifa.”

Platforming, Deplatforming & Replatforming
Following extremists around the Internet

Extremists around the world are increasingly being thrown off of social media. And so…the big question that I’m going to try to answer is, is this effective? Is it good? Who does it benefit? Is it good for the platforms, is it good for the extremists, is it good for the Internet, is it good for society at large?

The Tyranny of Algorithms and the Use of Predictive Policing by Israel

We have been documenting and researching human rights and digital rights violations that are taking place in Palestine and Israel. And one of the most recent case studies we’re looking into is the use of predictive policing by Israel, which is rather a sensitive issue given that there isn’t a lot that we know about the subject.

Virtual Futures Salon: Dawn of the New Everything, with Jaron Lanier

So here’s what happened. If you tell people you’re going to have this super-open, absolutely non-commercial, money-free thing, but it has to survive in this environment that’s based on money, where it has to make money, how does anybody square that circle? How does anybody do anything? And so companies like Google that came along, in my view were backed into a corner. There was exactly one business plan available to them, which was advertising.

Loving Out Loud in a Time of Hate Speech

Dangerous speech, as opposed to hate speech, is defined basically as speech that seeks to incite violence against people. And that’s the kind of speech that I’m really concerned about right now. That’s what we’re seeing on the rise in the United States, in Europe, and elsewhere.

Sleepwalking into Surveillant Capitalism, Sliding into Authoritarianism

We have increasingly smart, surveillant persuasion architectures. Architectures aimed at persuading us to do something. At the moment it’s clicking on an ad. And that seems like a waste. We’re just clicking on an ad. You know. It’s kind of a waste of our energy. But increasingly it is going to be persuading us to support something, to think of something, to imagine something.

AI and Ethical Design

I teach my students that design is ongoing risky decision-making. And what I mean by ongoing is that you never really get to stop questioning the assumptions that you’re making and that are underlying what it is that you’re creating—those fundamental premises.

The Algorithmic Spiral of Silence

A few major platforms, like Facebook, Twitter, and YouTube, have become in many places around the world a de facto public sphere. Especially in countries that have a less-than-free Internet and less-than-free mass media. And these countries have transitioned from a very controlled public sphere to a commercially run one like Facebook.

Malia Lazu, Black Reality 2.0: Creating and Making in the Digital Age

I became tired of knocking on the same doors and either seeing the same people or different people. But I really just felt like I was in this cycle of faux liberation, where I would feel a victory, and the victory was probably formed around the RFP for the grant that we needed to get in order to do our work.

Online Platforms as Human Rights Arbiters

What does it mean for human rights protection that we have large corporate interests—the Googles, the Facebooks of our time—that control and govern a large part of the online infrastructure?
