Archive (Page 2 of 4)

Holding To Account

I’m glad those social networks provide those services. I think it’s important for the dialogue to happen that way. But it can’t be the only way for us to have public discourse. Online, we only have these spaces that are owned by private companies. We don’t have public parks.

The Conversation #4 — Colin Camerer

We know very little about complex financial systems and how systemic risk, as it’s called, is computed and how you would manage policies. And if you look back at the financial crisis, you can either say, as many economists do, “It all had to do with badly-designed rules,” which may be part of the story; it’s certainly part of the story. Or it may have to do with the interaction of those rules and human nature, like mortgage broker greed, optimism… And you see it not just in individuals who now have houses in foreclosure, but at the highest levels.

Cybersecurity in the Age of Always-Connected Sensors

We all see the benefits of active safety systems in cars. But that same safety technology, if attacked, can actually allow you to immobilize a vehicle or even disable brakes while driving.

Personal Privacy Assistants in the Age of the Internet of Things

Imagine your privacy assistant is a computer program that’s running on your smartphone or your smartwatch. Your privacy assistant listens for privacy policies that are being broadcast over a digital stream. We are building standard formats for these privacy policies so that all sensors will speak the same language that your personal privacy assistant will be able to understand.
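To make the idea concrete, here is a minimal sketch of what such a listener might look like, assuming sensors broadcast their policies as JSON datagrams on a local network. The port number and field names (sensor_id, data_collected, purpose, retention_days, opt_out_url) are illustrative assumptions, not the standard format the project is actually building.

```python
import json
import socket

# Hypothetical port; the real standard described in the talk is not specified here.
POLICY_PORT = 5005

def listen_for_policies():
    """Listen for privacy-policy announcements broadcast as JSON over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", POLICY_PORT))
    while True:
        data, sender = sock.recvfrom(4096)
        try:
            policy = json.loads(data.decode("utf-8"))
        except (UnicodeDecodeError, json.JSONDecodeError):
            continue  # ignore announcements that are not valid JSON
        # Assumed schema: each sensor announces what it collects, why,
        # how long data is retained, and whether you can opt out.
        print(f"Sensor {policy.get('sensor_id', 'unknown')} at {sender[0]}")
        print(f"  collects:  {policy.get('data_collected')}")
        print(f"  purpose:   {policy.get('purpose')}")
        print(f"  retention: {policy.get('retention_days')} days")
        print(f"  opt-out:   {policy.get('opt_out_url')}")

if __name__ == "__main__":
    listen_for_policies()
```

In practice the assistant would compare each announced policy against the user’s stated preferences and surface only the mismatches, rather than printing every broadcast.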

The Internet of Damned Things

We have to be aware that when you create magic or occult things, when they go wrong they become horror. Because we create technologies to soothe our cultural and social anxieties, in a way. We create these things because we’re worried about security, we’re worried about climate change, we’re worried about the threat of terrorism. Whatever it is. And these devices provide a kind of stopgap for helping us feel safe or protected or whatever.

Data & Society Databite #41: Ifeoma Ajunwa on Genetic Coercion

The mythology of genetic coercion is the belief that genetic data, especially large-scale genetic databases, have the ability to pinpoint certain risks of disease, that they provide agency to act to prevent such disease, that they can be used to create accurate personalized treatments for disease, and that they should also be entrusted with the authority to dictate the modification of the genome for future generations.

Sin in the Time of Technology

Social media companies have an unparalleled amount of influence over our modern communications. […] These companies also play a huge role in shaping our global outlook on morality and what constitutes it. So the ways in which we perceive different imagery, different speech, are being increasingly defined by the regulations that these platforms put upon us [in] our daily activities on them.

When Algorithms Fail in Our Personal Lives

I wonder, with all these varying levels of needs that we have as users, and as we live more and more of our lives digitally and on social media, what would it look like to design a semi-private space in a public network?

No, Thank You: Agency, Imagination, and Possibilities for Rejecting World-Changing Tech

We’re trying to say it’s on you, it’s your responsibility, figure this out, download this, understand end-to-end encryption, when it’s a shared problem and it’s a communal problem.

Ask a Prison Librarian about Privacy, Technology, and State Control

What does it mean to be private when you’re in a place where you have no right to privacy but are ironically deprived of the thing that makes your privacy go away?