I think the question I’m trying to formulate is: in this world of increasing optimization, where the algorithms will be accurate… they’ll increasingly be accurate… but their application could lead to discrimination. How do we stop that?
I’m interested in data and discrimination, in the things that have come to make us uniquely who we are, how we look, where we are from, our personal and demographic identities, what languages we speak. These things are effectively incomprehensible to machines. What is generally celebrated as human diversity and experience is transformed by machine reading into something absurd, something that marks us as different.
People think that the Civil Rights Movement and all big epochal movements involve conscience, and they do. They also involve consciousness. I mean, you can’t struggle against what you’re unaware of, right? The Klan as the iconic carriers of violence, Bull Connor as the icon of southern white male resistance, George Wallace the iconic neopopulist racist. You know, these were historic figures in myth and reality. But we wouldn’t get to what they represented until much later.
This talk is more about the coercion of labor in open source software. So I want to take a critical look at how we can engage businesses and other stakeholders in technology companies to begin creating a more equal and sustainable environment for all people contributing to open source.