[ExI] Jaron Lanier on AI and Singularity, not transhumanism

Ben Zaiboc bbenzai at yahoo.com
Tue Aug 10 11:10:58 UTC 2010


Sabrina Ballard wrote:
"And transhumanism believe in becoming a human that modern day humans
would not understand"


Transhumanism is the idea that we can transcend our current limitations.  This may have the side-effect of turning us into beings that current humans wouldn't understand, but that's not the aim.

The 'core principle' of transhumanism, as it's understood by most people who would call themselves transhumanists, is that we can and should use technology to improve ourselves.  This includes using science and technology to understand ourselves and the world, applying what we learn to make things better, giving people more autonomy, more choice and more freedom, and spreading intelligence and awareness as far as possible.

There are visions of transhumanism that don't include a singularity (although, depending on how you define it, I think it will be difficult to avoid one).

Ben Zaiboc