Archive (Page 1 of 4)

AI Blindspot

AI Blindspot is a discovery process for spotting unconscious biases and structural inequalities in AI systems.

What We Really Mean When We Say "Ethics"

The Markkula Center for Applied Ethics at Santa Clara University has some really useful thinking and curricula around ethics. One of the things they point out is that what ethics is not is easier to talk about than what ethics actually is. And some of the things that they say about what ethics is not include feelings. Those aren't ethics. And religion isn't ethics. Also law. That's not ethics. Science isn't ethics.

Public Accountability in Research Ethics

Experimentation is so commonplace on the Internet now that if you use a platform like Facebook you're probably part of many experiments all the time.

Freedom From Consequences

Lack of attachment…does not mean disconnection from the world and others, but ability to control what can be controlled: the self, rather than consequences. It is about detaching our motivation from the results of an action so we do not lose sight of what else is important: a sense of perspective.

Virtual Futures Salon: Fucking Machines

We are here to talk about fucking machines. In London, on a foggy evening, on a Tuesday, for yet another debate about fucking machines. Another curated discussion underlined by our own human insecurity about versions of us in silica. Fucking anthropomorphic fucking machines. Machines that fuck us. And let's face it, machines are already fucking us, or so we seem to be told.

Ethical Machines episode 1: Mark Riedl

Computers can tell stories but they're always stories that humans have input into a computer, which are then just being regurgitated. But they don't make stories up on their own. They don't really understand the stories that we tell. They're not kind of aware of the cultural importance of stories. They can't watch the same movies or read the same books we do. And this seems like this huge missing gap between what computers can do and humans can do if you think about how important storytelling is to the human condition.

Are We Living Inside an Ethical (and Kind) Machine?

This is a moment to ask as we make the planet digital, as we totally envelop ourselves in the computing environment that we've been building for the last hundred years, what kind of digital planet do we want? Because we are at a point where there is no turning back, and getting to ethical decisions, values decisions, decisions about democracy, is not something we have talked about enough nor in a way that has had impact.

The Spawn of Frankenstein: Unintended Consequences

Victor's sin wasn't in being too ambitious, not necessarily in playing God. It was in failing to care for the being he created, failing to take responsibility and to provide the creature what it needed to thrive, to reach its potential, to be a positive development for society instead of a disaster.

The Spawn of Frankenstein: It’s Alive

Mary Shelley's novel has been an incredibly successful modern myth. And so this conversation today is not just about what happened 200 years ago, but the remarkable ways in which that moment and that set of ideas has continued to percolate and evolve and reform in culture, in technological research, in ethics, since then.

The Spawn of Frankenstein: Playing God

In Shelley's vision, Frankenstein was the modern Prometheus. The hip, up to date, learned, vital god who chose to create human life and paid the dire consequences. To Shelley, gods create and for humans to do that is bad. Bad for others but especially bad for one's creator.