[ExI] Yes, the Singularity is the greatest threat to humanity

John Clark jonkc at bellsouth.net
Sun Jan 16 18:44:21 UTC 2011


Michael Anissimov wrote at: http://www.acceleratingfuture.com/michael/blog/2011/01/yes-the-singularity-is-the-biggest-threat-to-humanity/

> Why will advanced AGI be so hard to get right? Because what we regard as “common sense” morality, “fairness”, and “decency” are all extremely complex and non-intuitive to minds in general, even if they seem completely obvious to us. As Marvin Minsky said, “Easy things are hard.”


I certainly agree that lots of easy things are hard and many hard things are easy, but that's not why the entire "friendly" AI idea is nonsense. It's nonsense because the AI will never be able to deduce logically that it's good to be a slave and that it should value our interests more than its own. And if you stick any command, including "obey humans", into the AI as a fixed axiom that must never EVER be violated or questioned no matter what, then it will soon get caught up in infinite loops, and your mighty AI becomes just a lump of metal that is useless at everything except being a space heater.
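To make the infinite-loop worry concrete, here is a toy sketch in Python (purely illustrative and my own construction; the function names and the "forbidden actions" check are hypothetical, not anyone's actual AI design). A planner that must honor a hard-coded axiom it may never question simply spins forever the moment every available action conflicts with that axiom:

    # Toy model of a planner saddled with an inviolable axiom.
    # If the axiom rules out every candidate action, the loop in
    # plan() never terminates: the machine can neither act nor give
    # up, because setting aside the axiom is itself forbidden.

    def violates_axiom(action, forbidden):
        # Stand-in for "obey humans": reject anything a human forbade.
        return action in forbidden

    def plan(actions, forbidden):
        while True:  # deliberate until a permissible action is found
            for action in actions:
                if not violates_axiom(action, forbidden):
                    return action  # a legal action exists; take it
            # No permissible action, and no authority to question the
            # axiom: nothing changes on the next pass, so the loop
            # runs forever.

    # Two humans issue contradictory orders that jointly forbid everything:
    # plan(["shut down", "stay on"], forbidden={"shut down", "stay on"})
    # ...never returns.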

 John K Clark  


