Archive (Page 1 of 4)

Elizabeth Feinler’s Internet Hall of Fame 2012 Profile

I’m Elizabeth Feinler, usually known as “Jake.” That’s my nickname. And I ran the contract for the Network Information Center on both the ARPANET and the Defense Data Network back in the 70s and 80s.

The Conversation #45 — James Bamford

You’re not going to get a generation of people outraged that somebody’s reading their email like you would’ve in the 70s getting a generation of people outraged that you’re reading their snail mail.

Online Platforms as Human Rights Arbiters

What does it mean for human rights protection that we have large corporate interests—the Googles, the Facebooks of our time—that control and govern a large part of the online infrastructure?

Decoding Workforce Productivity: Nita A. Farahany

Are there any limits to the connected workplace? Are there any concerns about the connected workplace? Is there any way in which you wouldn’t want either yourself or an employee to be connected? Are there any limits to the kinds of information we can gather in order to make our workforces more productive? In order to make our overall society more productive?

Forbidden Research Welcome and Introduction: Cory Doctorow

At that moment when everybody is suddenly caring about this stuff, that’s the moment at which nihilism can be averted. It’s the moment in which nihilism must be averted if you’re going to make a change. Peak indifference is the moment when you stop convincing people to care about an issue, and start convincing them to do something about it.

What Do Algorithms Know?

The Tyranny of Algorithms is obviously a polemical title to start a conversation around computation and culture. But I think that it helps us get into the cultural, the political, the legal, the ethical dimensions of code. Because we so often think of code, and code is so often constructed, in a purely technical framework, by people who see themselves as solving technical problems.

Holding To Account

I’m glad those social networks provide those services. I think it’s important for the dialogue to happen that way. But it can’t be the only way for us to have public discourse. Online, we only have these spaces that are owned by private companies. We don’t have public parks.

The Conversation #4 — Colin Camerer

We know very little about complex financial systems and how systemic risk, as it’s called, is computed and how you would manage policies. And if you look back at the financial crisis, you can either say, as many economists do, “It all had to do with badly-designed rules,” which may be part of the story; it’s certainly part of the story. Or it may have to do with the interaction of those rules and human nature, like mortgage broker greed, optimism… And you see it not just in individuals who now have houses in foreclosure, but at the highest levels.

Cybersecurity in the Age of Always-Connected Sensors

We all see the benefits of active safety systems in cars. But that same safety technology, if attacked, can actually allow you to immobilize a vehicle or even disable brakes while driving.

Personal Privacy Assistants in the Age of the Internet of Things

Imagine your privacy assistant is a computer program that’s running on your smartphone or your smartwatch. Your privacy assistant listens for privacy policies that are being broadcast over a digital stream. We are building standard formats for these privacy policies so that all sensors will speak the same language that your personal privacy assistant will be able to understand.

