Archive (Page 2 of 3)

The Conversation #45 — James Bamford

You’re not going to get a generation of people outraged that somebody’s reading their email like you would’ve in the 70s getting a generation of people outraged that you’re reading their snail mail.

A Network of Sorrows: Small Adversaries and Small Allies

In an environment where everybody can pick up everybody’s tools, we’re all weirdly empowered now. And I mean weird in an almost fey sense: our powers are weird, they make us weird, and they make our conflicts weird. It’s again that idea that our tools are interacting with our human flaws in really, really interesting ways.

Forbidden Research: Against the Law: Countering Lawful Abuses of Digital Surveillance

When I announced the talk on Twitter, somebody immediately was like, “Lawful abuse, isn’t that a contradiction?” But if you think about it for just a moment, it might seem to be a little bit more clear. After all, the legality of a thing is quite distinct from the morality of it.

Forbidden Research Welcome and Introduction: Cory Doctorow

At that moment when everybody is suddenly caring about this stuff, that’s the moment at which nihilism can be averted. It’s the moment in which nihilism must be averted if you’re going to make a change. Peak indifference is the moment when you stop convincing people to care about an issue, and start convincing them to do something about it.

What Do Algorithms Know?

The Tyranny of Algorithms is obviously a polemical title to start a conversation around computation and culture. But I think that it helps us get into the cultural, the political, the legal, the ethical dimensions of code. Because we so often think of code, and code is so often constructed, in a purely technical framework, by people who see themselves as solving technical problems.

Cybersecurity in the Age of Always-Connected Sensors

We all see the benefits of active safety systems in cars. But that same safety technology, if attacked, can actually allow you to immobilize a vehicle or even disable brakes while driving.

The Internet of Damned Things

We have to be aware that when you create magic or occult things, when they go wrong they become horror. Because we create technologies to soothe our cultural and social anxieties, in a way. We create these things because we’re worried about security, we’re worried about climate change, we’re worried about the threat of terrorism. Whatever it is. And these devices provide a kind of stopgap for helping us feel safe or protected or whatever.

Selfies & security

We use the norms and tools society gives us to express the feelings we have about ourselves and others. But we’re vulnerable, and this is proven even more so with events like The Snappening, where thousands of supposedly private, ephemeral images were leaked, many of which were nudes of young women.

Privacy, censorship, and security in the Middle East

So I got curious, and I asked myself what is the Iranian Internet, and who is the Iranian user? I was pissed off enough, like I said, to take a step or to feel the urge to do something. To feel the urge of making something. And the thing that I really wanted to bring across was that censorship is happening in a different country, where it’s being used to bring across information, to make voices heard.

Threat Modeling and Operational Security

I figured I would give a presentation to better explain the work that I do and show, hopefully without getting too technical, how you can think about the way you go about your online life and the traces you leave online, and what this means for the work that you do, the people you interact with, and so on.