
Cybersecurity in the Age of Always-Connected Sensors

We all see the benefits of active safety systems in cars. But that same safety technology, if attacked, can actually be used to immobilize a vehicle or even disable the brakes while driving.

Personal Privacy Assistants in the Age of the Internet of Things

Imagine your privacy assistant is a computer program that's running on your smartphone or your smartwatch. Your privacy assistant listens for privacy policies that are being broadcast over a digital stream. We are building standard formats for these privacy policies so that all sensors will speak the same language, one that your personal privacy assistant will be able to understand.
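The broadcast-and-listen mechanism described above could be sketched roughly as follows. This is a minimal illustration only: the policy fields, sensor name, and preference shape are all hypothetical assumptions, not the project's actual standard format.

```python
import json

# Hypothetical broadcast privacy policy. Field names are illustrative
# assumptions, not the actual standard the project defines.
broadcast = json.dumps({
    "sensor": "lobby-camera-01",
    "data_collected": ["video"],
    "retention_days": 30,
    "opt_out_supported": True,
})

# User preferences the personal privacy assistant enforces (assumed shape).
preferences = {"video": {"max_retention_days": 7}}

def review_policy(policy_json, prefs):
    """Parse a broadcast policy and flag anything exceeding user limits."""
    policy = json.loads(policy_json)
    warnings = []
    for data_type in policy["data_collected"]:
        limit = prefs.get(data_type, {}).get("max_retention_days")
        if limit is not None and policy["retention_days"] > limit:
            warnings.append(f"{policy['sensor']} keeps {data_type} "
                            f"{policy['retention_days']}d (limit {limit}d)")
    return warnings

print(review_policy(broadcast, preferences))
# → ['lobby-camera-01 keeps video 30d (limit 7d)']
```

The point of a shared format is exactly what the quote describes: once every sensor emits policies in one machine-readable language, a single assistant can compare them all against the user's preferences automatically.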

The Internet of Damned Things

We have to be aware that when you create magic or occult things, when they go wrong they become horror. Because we create technologies to soothe our cultural and social anxieties, in a way. We create these things because we're worried about security, we're worried about climate change, we're worried about the threat of terrorism. Whatever it is. And these devices provide a kind of stopgap for helping us feel safe or protected or whatever.

Data & Society Databite #41: Ifeoma Ajunwa on Genetic Coercion

The mythology of genetic coercion is the belief that genetic data, especially large-scale genetic databases, have the ability to pinpoint certain risks of disease, that they provide the agency to act to prevent such disease, that they can be used to create accurate personalized treatment for disease, and that they should also be entrusted with the authority to dictate the modification of the genome for future generations.

Sin in the Time of Technology

Social media companies have an unparalleled amount of influence over our modern communications. […] These companies also play a huge role in shaping our global outlook on morality and what constitutes it. So the ways in which we perceive different imagery, different speech, are being increasingly defined by the regulations that these platforms put upon us [in] our daily activities on them.

When Algorithms Fail in Our Personal Lives

I wonder, with all these varying levels of needs that we have as users, and as we live more and more of our lives digitally and on social media, what would it look like to design a semi-private space in a public network?

No, Thank You: Agency, Imagination, and Possibilities for Rejecting World-Changing Tech

We’re trying to say it’s on you, it’s your responsibility, figure this out, download this, understand end-to-end encryption, when it’s a shared problem and it’s a communal problem.

Ask a Prison Librarian about Privacy, Technology, and State Control

What does it mean to be private when you’re in a place where you have no right to privacy but are ironically deprived of the thing that makes your privacy go away?

If You Build It, They Won’t Care: Designing Privacy-Preserving Technologies for People with Other Interests

I think that privacy is something that we can think of in terms of a civil right, as individuals. […] That’s a civil rights issue. But I think there’s also a way to think about it in terms of a social issue that’s larger than simply the individual.

Human Rights Meets Design Challenges

How do we take this right that you have to your data and put it back in your hands, and give you control over it? And how do we do this not just from a technological perspective, but from a human perspective?
