[extropy-chat] Singularitarian versus singularity +HIGH IQS!
gts
gts_2000 at yahoo.com
Fri Dec 23 03:22:22 UTC 2005
On Thu, 22 Dec 2005 21:30:04 -0500, Joseph Bloch
<transhumanist at goldenfuture.net> wrote:
> Only if self-survival is a priority. I think what Marc was getting at is
> that self-survival is a priority for us, because evolutionary pressures
> selected against those individuals for whom it was not a priority. A
> hypothetical superintelligence, not having been necessarily subject to
> those same sorts of evolutionary pressures, might not have self-survival
> as a priority.
Except that evolution continues. Natural selection would favor those
super-intelligences for which self-survival was a priority. A
super-intelligence might recognize this fact and make survival a priority.
> I could also point out that self-sacrifice for a cause...
Yes, but self-sacrifice for a cause is something different.
It would be nice to imagine super-intelligences that identify with the
human species: minds whose super-intelligence is devoted to promoting
the survival of humans rather than of themselves, and which might even
sacrifice themselves for humans. But then of course we already do.
> A superintelligence might well come to the conclusion that its continued
> existence was not optimal for the well-being of the universe, and
> destroy itself.
After a last supper. :)
-gts