Archive

Spring 2021 #OSSTA Lecture: Everest Pipkin

In general I work with data sets, “big data,” but with the full knowledge that this is only ever the lives and experiences of people bundled up and repackaged through processes angled for usefulness or at the very least posterity.

Problematic Predictions: A Complex Question for Complex Systems

When you make a decision to opt for an automated process, to some extent you’re already compromising transparency by doing so. Or you could say it the other way around. It’s possible to argue that if you opt for extremely strict transparency regulation, you’re making a compromise in terms of automation.

Disposable Life: Richard Sennett

In the world of labor and work, the phrase “disposable life” refers to a new wrinkle in neoliberal capitalism. And that wrinkle is that it’s cheaper to dispose of workers in Europe and America than it’s ever been in the past.

Artificial Intelligence is Hard to See: Social & Ethical Impacts of AI

The big concerns that I have about artificial intelligence are really not about the Singularity, which frankly computer scientists say is…if it’s possible at all it’s hundreds of years away. I’m actually much more interested in the effects that we are seeing of AI now.

Big Data Bodies: Machines and Algorithms in the World

I’m interested in data and discrimination, in the things that have come to make us uniquely who we are, how we look, where we are from, our personal and demographic identities, what languages we speak. These things are effectively incomprehensible to machines. What is generally celebrated as human diversity and experience is transformed by machine reading into something absurd, something that marks us as different.

Sleepwalking into Surveillant Capitalism, Sliding into Authoritarianism

We have increasingly smart, surveillant persuasion architectures. Architectures aimed at persuading us to do something. At the moment it’s clicking on an ad. And that seems like a waste. We’re just clicking on an ad. You know. It’s kind of a waste of our energy. But increasingly it is going to be persuading us to support something, to think of something, to imagine something.

Social and Ethical Challenges of AI

One of the challenges of building new technologies is that we often want them to solve things that have been very socially difficult to solve. Things that we don’t have answers to, problems that we don’t know how we would best go about solving in a socially responsible way.

Forbidden Research: Why We Can’t Do That

Quite often when we’re asking these difficult questions, we’re asking questions where we might not even know how to ask where the line is. But in other cases, when researchers work to advance public knowledge, even on uncontroversial topics, we can still find ourselves forbidden from doing the research or disseminating the research.

Harnessing Artificial Intelligence to Target Conservation Efforts

The smartphone is the ultimate example of a universal computer. Apps transform the phone into different devices. Unfortunately, the computational revolution has done little for the sustainability of our Earth. Yet, sustainability problems are unique in scale and complexity, often involving significant computational challenges.

Applying Algorithms to Minimize Risk

The United States plants more than 170 million acres of corn and soybeans a year, more than any other country in the world. And the primary mechanism in the US that we use to subsidize agriculture is actually called the Federal Crop Insurance Program. So, the crop insurance program in the US is also the largest such program globally, with over $100 billion in liabilities annually. So it’s a very big program.
