I think the question I’m trying to formulate is: in this world of increasing optimization, the algorithms will be increasingly accurate, but their application could lead to discrimination. How do we stop that?
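As a concrete illustration of that tension, here is a minimal Python sketch, with invented numbers, of how a model that is perfectly accurate can still select from two groups at very different rates when their base rates differ. The 0.80 cutoff referenced at the end is the common "four-fifths" disparate-impact heuristic, included only as a reference point, not anything stated on the panel.

```python
# A minimal sketch (invented numbers) of how a perfectly accurate model
# can still produce discriminatory outcomes when applied to groups with
# different base rates. The 0.80 threshold is the common "four-fifths"
# disparate-impact heuristic, used here only as a reference point.

# Each applicant is (group, truly_qualified). The "model" is 100%
# accurate: it selects exactly the qualified applicants.
applicants = (
    [("A", True)] * 60 + [("A", False)] * 40    # group A: 60% qualified
    + [("B", True)] * 30 + [("B", False)] * 70  # group B: 30% qualified
)

def selection_rate(group):
    outcomes = [qualified for g, qualified in applicants if g == group]
    return sum(outcomes) / len(outcomes)

rate_a = selection_rate("A")
rate_b = selection_rate("B")
print(f"selection rate, group A: {rate_a:.0%}")    # 60%
print(f"selection rate, group B: {rate_b:.0%}")    # 30%
print(f"impact ratio B/A: {rate_b / rate_a:.2f}")  # 0.50, well below 0.80
```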
The panel will examine how real-world biases and inequality are replicated and systematically integrated into seemingly neutral algorithms and databases.
I often try to tell people that Google is not providing information retrieval algorithms; it’s providing advertising algorithms. And that is a very important distinction when we think about what kind of information is available in these corporate-controlled spaces.
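To make that distinction concrete, here is a hypothetical sketch contrasting a pure relevance ranking with a revenue-weighted one. The document names, scores, and blending weight are all invented for illustration; this is not a description of Google's actual ranking function.

```python
# A hypothetical contrast between ranking by relevance alone and ranking
# by a revenue-weighted score. Document names, scores, and the blending
# weight are all invented; this is not Google's actual ranking function.

docs = [
    {"title": "community health clinic", "relevance": 0.9, "ad_value": 0.1},
    {"title": "sponsored supplement shop", "relevance": 0.4, "ad_value": 0.9},
    {"title": "public library guide", "relevance": 0.8, "ad_value": 0.0},
]

def retrieval_rank(docs):
    # Pure information retrieval: order results by relevance to the query.
    return sorted(docs, key=lambda d: d["relevance"], reverse=True)

def ad_rank(docs, ad_weight=0.7):
    # Advertising ranking: blend relevance with the monetary value of a click.
    score = lambda d: (1 - ad_weight) * d["relevance"] + ad_weight * d["ad_value"]
    return sorted(docs, key=score, reverse=True)

print([d["title"] for d in retrieval_rank(docs)])  # clinic, library, shop
print([d["title"] for d in ad_rank(docs)])         # shop, clinic, library
```

The same three pages, scored two ways, come back in different orders: the sponsored page jumps from last to first once the objective includes ad value.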
When we think about network graphs and the network effects that make up an important part of how social movements form and how information is distributed online, there’s an assumption in those visualizations that every node in the network is equal. But very often, and you can slice the data in many different ways, the languages that we speak actually limit the networks that we have access to and that we’re interacting with.
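Here is a minimal sketch of that point on an invented five-node graph: the network looks fully connected when every node is treated as equal, but restricting traversal to edges whose endpoints share a language shrinks the reachable audience.

```python
# A minimal sketch (invented graph) of why "every node is equal" is a
# misleading assumption: the reachable audience shrinks once edges are
# restricted to pairs of nodes that share a language.

from collections import deque

languages = {  # node -> languages spoken (hypothetical)
    "a": {"en"}, "b": {"en"}, "c": {"es"}, "d": {"es"}, "e": {"en"},
}
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "e")]

def reachable(start, respect_language):
    # Breadth-first search over the (undirected) edge list.
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for u, v in edges:
            if node not in (u, v):
                continue
            neighbor = v if node == u else u
            # Skip edges between nodes with no language in common.
            if respect_language and not (languages[node] & languages[neighbor]):
                continue
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return sorted(seen)

print(reachable("a", respect_language=False))  # ['a', 'b', 'c', 'd', 'e']
print(reachable("a", respect_language=True))   # ['a', 'b', 'e']
```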
What I’m arguing primarily today is that focusing on pedagogy is a key aspect of social justice work, and that teaching critical data literacy, along with other digital literacy skills, is a central part of what we need to do.
I think that we need a radical design change. If I were teaching an HCI class or a design class with you, I might ask, “How are you going to design this so that not one life is lost?” What if that were the design imperative, rather than what your IPO is going to be?