[extropy-chat] Singularitarian versus singularity +HIGH IQS!

gts gts_2000 at yahoo.com
Fri Dec 23 01:59:17 UTC 2005


On Thu, 22 Dec 2005 04:14:28 -0500, Marc Geddes <marc.geddes at gmail.com>  
wrote:

>> What are cognitive processes for, if not for achieving contentment and
>> survival?
>
> Humans have the survival urge because evolution made us that way.   
> There's no reason why this urge would apply to intelligences in general.

By your definition, then, a super-intelligence might destroy itself for no
particular reason. But to me that would seem a very unintelligent thing
to do.

> Further, different intelligences would have quite different definitions
> of what constitutes 'contentment'.

I agree 'contentment' is a tricky concept. I included it with some doubts,
but 'survival' seems much more basic to any definition of intelligence. I
consider all living things to have some basic level of machine
intelligence programmed into their DNA, an intelligence that promotes the
survival of the organism and, more generally, of its genes.
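
To make this concrete, here is a toy sketch in Python (my own
illustration, with arbitrary numbers, not drawn from any biology
literature): a population of agents in which a single hard-wired
'flee danger' reflex stands in for the survival-promoting machine
intelligence I have in mind. Selection alone drives the reflex toward
fixation.

import random

POP_CAP = 1000  # keep the population from growing without bound

def generation(population):
    # Agents with the reflex survive danger more often (0.9 vs. 0.4,
    # numbers chosen purely for illustration).
    survivors = [agent for agent in population
                 if random.random() < (0.9 if agent else 0.4)]
    # Each survivor leaves one offspring; the reflex bit rarely mutates.
    children = [agent if random.random() > 0.01 else not agent
                for agent in survivors]
    next_gen = survivors + children
    return random.sample(next_gen, min(POP_CAP, len(next_gen)))

# Start with the reflex in half the population (True = has the reflex).
population = [random.random() < 0.5 for _ in range(POP_CAP)]
for _ in range(20):
    population = generation(population)
print(sum(population) / len(population))  # approaches 1.0

Nothing in this sketch is intelligent in any rich sense, which is
exactly my point: the most basic goal-seeking machinery selection can
build is machinery for survival.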

>  One guy might be content with chocolate ice cream, but someone else
> likes vanilla.  Specific end goals can't suffice to give a general
> definition of intelligence.

In both cases they are seeking contentment.

> See above.  Specific ends cannot suffice for a general definition of
> intelligence.  Incidentally, my proposed definition is both a means *and*
> an end.

If intelligence is not for survival, then explain why a super-intelligence
would not randomly self-destruct. Or, if you believe it might randomly
self-destruct, explain why you would still consider it intelligent.

-gts



