[ExI] The symbol grounding problem in strong AI.

John Clark jonkc at bellsouth.net
Mon Jan 4 16:49:44 UTC 2010


On Jan 3, 2010, at 2:42 PM, Gordon Swobe wrote:

>> The operative word in the above is "evolved". Why did this mysterious
>> "subjective symbol grounding" (bafflegab translation: consciousness) 
>> evolve? 
> 
> To help you communicate better with other monkeys, among other things.

So consciousness affects behavior; say goodbye to the Chinese Room.

> I think you really want to ask how it happened that humans did not evolve as unconscious zombies. Why did evolution select consciousness? I think one good answer is that perhaps nature finds it cheaper when its creatures have first-person awareness of the things they do and say. 

So it's easier to make a conscious intelligence than an unconscious one.
> 
> We would probably find it more efficient in computers also.

So if you ever run across an intelligent computer you can be certain it's conscious. Or at least as certain as you are that your fellow human beings are conscious when they act intelligently.

 John K Clark

