[ExI] AI extinction risk

Anders Sandberg anders at aleph.se
Fri Mar 14 14:02:12 UTC 2014


BillK <pharos at gmail.com>, 14/3/2014 12:22 PM:

To me this looks like a very strong near-term prediction. It doesn't 
even need superhuman AI to produce a lot of unemployment. 
Yes. Yesterday Carl Frey (economist, also from FHI) and Michael Osborne (machine learning) gave a good talk about it: http://www.oxfordmartin.ox.ac.uk/videos/view/375

Globalization has been described as 'Using third world slave labour to 
produce cheap goods to sell to our first world unemployed population'. 
Which is of course not a good description, given how the income and wealth of the developing world have been growing over the past decades. Check Rosling's graphs.
But a big system change seems inevitable. 
Yes. The current system is *always* getting replaced, but sometimes in a more dramatic fashion. 
A rather unsettling argument is http://qz.com/185945/drones-are-about-to-upheave-society-in-a-way-we-havent-seen-in-700-years/
It notes that some basic assumptions about how our societies have worked over the past centuries may have rested on the limits of concentrating and controlling force. I am not entirely convinced he is 100% right, but clearly even the option of guaranteed loyal robot armies (which can be manufactured by having capital rather than labour) is going to change things a lot. It is a bit like Drexler's considerations in http://e-drexler.com/d/06/00/EOC/EOC_Chapter_11.html ("But with advanced technology, states need not control people - they could instead simply discard people."). Basically, with enough automation one could amplify power distances a lot. 
The AI disaster scenario splits into two cases: the good old "superintelligence out of control" problem, which we have spent much effort on handling, and the "AI-empowered people out of control" scenario, which is a tricky 'thick' socioeconomic problem. 

Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University