[extropy-chat] Singularitarian versus singularity +HIGH IQS!

Joseph Bloch transhumanist at goldenfuture.net
Fri Dec 23 02:30:04 UTC 2005


gts wrote:

> On Thu, 22 Dec 2005 04:14:28 -0500, Marc Geddes 
> <marc.geddes at gmail.com>  wrote:
>
>>> What are cognitive processes for, if not for achieving contentment and
>>> survival?
>>
>>
>> Humans have the survival urge because evolution made us that way.   
>> There's no reason why this urge would apply to intelligences in general.
>
>
> By your definition then, a super-intelligence might destroy itself for 
> no  particular reason.  But to me this would seem a very unintelligent 
> thing  to do.


Only if self-survival is a priority. I think what Marc was getting at is 
that self-survival is a priority for us because evolutionary pressures 
selected against those individuals for whom it was not. A hypothetical 
superintelligence, not necessarily having been subject to those same 
evolutionary pressures, might not have self-survival as a priority.

I could also point out that self-sacrifice for a cause could easily be 
seen as an "intelligent thing to do" from the standpoint of the 
superintelligence. A superintelligence might well conclude that its 
continued existence was not optimal for the well-being of the universe, 
and destroy itself. We, not possessing superintelligence, might fail to 
recognize this and, from our limited perspective, interpret it as a 
random act done for no particular reason.

What mortal can know the motives of the Gods?

Joseph
