In a series of short talks we’re going to share examples of some of our past and upcoming work, alongside examples from our parent organization Global Voices. But I want to start by saying something about how we go about our work.
One of the things I found interesting about both of your conversations is that as code becomes a powerful force in society, we're no longer just citizens trying to encourage the government or members of Congress to change laws. We now find ourselves standing outside of companies saying, well, there's code here that affects our lives.
Liberal users make up a larger percentage of r/politics users, while conservatives make up a smaller one. Through their voting, those users can control what is seen and what is not seen. So liberal users, as a bloc, will more often than not downvote something they don't agree with.
r/science is really the largest science forum on the Internet. We say that we have more than 18 million subscribed users. For a point of reference, the total combined subscriber base of the top ten newspapers in the United States is around ten million.
Experimentation is so commonplace on the Internet now that if you use a platform like Facebook you’re probably part of many experiments all the time.
Underlying this project is a pretty simple, and we think powerful, idea that provides a solution to a complex challenge facing online communities like Twitter and Reddit within the CivilServant universe. That challenge is the increasing automation of the enforcement of legal rules and norms online.
Public Lab is a community and a nonprofit, and we do environmental work with people all over the world. We really try to address environmental issues that affect people. What we do, we call community science.
I’m a professor here in Comparative Media Studies and I’m co-director of an organization called AnyKey, which I’ll tell you a little bit about today. We launched in 2016 with the help of Intel and ESL. We’re an organization dedicated to fairness, equity, and inclusivity in gaming, and in particular esports.
I consider myself to be an algorithm auditor. So what does that mean? Well, I’m inherently a suspicious person. When I start interacting with a new service or a new app and it appears to be doing something dynamic, I immediately begin to question what is going on inside the black box, right? What is powering these dynamics? And ultimately, what is their impact?
When I talk about online harassment I’m referring to a very broad spectrum of abusive behaviors that are enabled by technology platforms and used to target a user or a group of users. This can be anything from flaming, the use of personal insults or inflammatory language, to doxing, revealing or broadcasting personal information about someone such as a phone number or address, to stalking, impersonation, and things of that nature.