[ExI] Jaron Lanier on AI and Singularity, not transhumanism

John Clark jonkc at bellsouth.net
Tue Aug 10 05:08:09 UTC 2010


On Aug 10, 2010, at 12:12 AM, Gregory Jones wrote:

> It is perfectly consistent logically to assume humans remain in our present form with our present bad habits right up until the singularity occurs, then to assume that the singularity doesn't need us, so that by some mysterious means we are no more. That would be an example of a non-transhumanist singularitarian.

No, it wouldn't: after the singularity there would be entities that are beyond human, way beyond, trans-human. A better example of what you're talking about would be World War 3 or a large asteroid impact, but that's not the sort of singularity a singularitarian usually means.

  John K Clark



