[ExI] Hard Takeoff
Michael Anissimov
michaelanissimov at gmail.com
Tue Nov 16 03:03:48 UTC 2010
Heya Spike,
On Sun, Nov 14, 2010 at 10:10 PM, spike <spike66 at att.net> wrote:
>
> I am not advocating a Bill Joy approach of eschewing AI research, just the
> opposite. A no-singularity future is 100% lethal to every one of us, every
> one of our children and their children forever. A singularity gives us some
> hope, but also much danger. The outcome is far less predictable than
> nuclear fission.
>
Would you say the same thing if the Intelligence Explosion were initiated by
the most trustworthy and altruistic human being in the world, if one could
be found?
In general, I agree with you, except for the last sentence.
--
michael.anissimov at singinst.org
Singularity Institute
Media Director