[ExI] The symbol grounding problem in strong AI.

John Clark jonkc at bellsouth.net
Sun Jan 10 17:53:24 UTC 2010


On Jan 9, 2010 Gordon Swobe wrote:

> I think consciousness will likely turn out to be just a state that natural brains can enter, not unlike the way water can enter a state of solidity.

In a way I sort of agree with that, but I don't see why a computer couldn't enter the same state. And not all water is solid; solidity is a function of temperature. In your analogy, what is the equivalent of temperature? We have enormously powerful evidence that it must be intelligence. We know from direct experience that there is a one-to-one correspondence between consciousness and intelligence: when we are intelligent we are conscious, and when we are not intelligent, as when we are sleeping or under anesthesia, we are not conscious.

> Some people seem to think that if we can compute X on a computer, then a computer simulation of X must equal X. But that's just a blatant non sequitur.

So if I add 2+2 on my computer and you add 2+2 on your computer, it's a blatant non sequitur to think that my 4 is the same as your 4.
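
To make the point concrete, here is a quick Python sketch (just my own illustration, nothing Gordon proposed): one function adds 2+2 directly, another "simulates" addition the way hardware does it, one bit at a time, and the two results are indistinguishable.

    def direct_add(a, b):
        # Addition done "natively" by the machine.
        return a + b

    def simulated_add(a, b, bits=8):
        # Simulate addition with a ripple-carry adder built from
        # boolean logic, one bit at a time.
        result, carry = 0, 0
        for i in range(bits):
            x = (a >> i) & 1
            y = (b >> i) & 1
            s = x ^ y ^ carry
            carry = (x & y) | (carry & (x ^ y))
            result |= s << i
        return result

    print(direct_add(2, 2))                         # 4
    print(simulated_add(2, 2))                      # 4
    print(direct_add(2, 2) == simulated_add(2, 2))  # True

The simulated 4 and the directly computed 4 are the same 4 in every respect that can be tested.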

  John K Clark