One of the challenges of building new technologies is that we often want them to solve things that have been very socially difficult to solve. Things that we don’t have answers to, problems that we don’t know how best to approach in a socially responsible way.
In a world of conflicting values, it’s going to be difficult to develop values for AI that are not the lowest common denominator.
I think one of the things I want to say from the start is it’s not like AI is going to appear. It’s actually out there, in some instances in ways that we never even notice.
Machine learning systems today have become so powerful and are being introduced into everything from self-driving cars, to predictive policing, to assisting judges, to producing your Facebook news feed and deciding what you ought to see. They have significant societal impacts. But they’re very difficult to audit.
I think there are countless amazing opportunities for artificial intelligence and its impact on society. I think one of the areas I’m truly the most excited about is education.
I think developments in artificial intelligence do pose a strong challenge for humanity. I think at a very fundamental level, people don’t quite understand what artificial intelligence is, yet it’s used as a buzzword that’s going to solve every single problem.
Some of the long-term challenges are very hypothetical—we don’t really know if they will ever materialize in this way. But in the short term I think AI poses some regulatory challenges for society.
This talk is more about the coercion of labor into open source software. So I want to take a critical look at how we can engage businesses and other stakeholders in the technology industry to begin creating a more equal and sustainable environment for all people contributing to open source.
The best justification we have for killing fifty-six, fifty-seven, whatever billion land animals and a trillion sea animals every year is that they taste good. So, in a sense, how is this any different from Michael Vick, who likes to sit around a pit watching dogs fight—or at least he used to?
Solar geoengineering rests on a simple idea: it is technically possible to make the Earth a little more reflective so that it absorbs a little less sunlight, which would partly counteract some of the risks that come from accumulating carbon dioxide in the atmosphere. When I say technically possible, it appears that doing this at least in a crude way is actually easy, in the sense that it could be done with commercial off-the-shelf technologies now, and at a cost that is really trivial—roughly a part in a thousand or a part in ten thousand of global GDP.