[extropy-chat] Singularitarian versus singularity +HIGH IQS!

gts gts_2000 at yahoo.com
Thu Dec 22 08:55:05 UTC 2005


On Thu, 22 Dec 2005 01:52:29 -0500, Marc Geddes <m_j_geddes at yahoo.com.au>  
wrote:

> No.  'Contentment' and 'survival' are human goals.  No
> reason why super-intelligences should have them.

Really? I think super-intelligences might disagree.

> Further, any definition of intelligence should surely
> involve cognitive processes, not end goals.

What are cognitive processes for, if not for achieving contentment and  
survival?

> Can you come up with a better definition than the
> three I gave above?  I've been thinking about this
> for... what... 3 years now.

I like your three definitions. I just think my fourth subsumes your first  
three, in the same way you think your third subsumes your first two. Yours  
are about the means of intelligence; mine is about the ends.

Future super-intelligences are, to my way of thinking, just another step  
along the path of evolution, and evolution is about survival.

For whatever reason, objects in reality have a tendency to persist. In  
sentient things that tendency becomes a will to persist.

-gts
