[ExI] The symbol grounding problem in strong AI
Gordon Swobe
gts_2000 at yahoo.com
Sat Jan 9 22:14:55 UTC 2010
--- On Sat, 1/9/10, Damien Broderick <thespike at satx.rr.com> wrote:
>> how can I say this without assigning some kind of
>> strange non-computable aspect to natural brains.
>> The answer is that I say it because I don't believe
>> the brain is actually a computer.
>
> Isn't that exactly saying that you assign some kind of
> non-computable aspect to natural brains?
No, I think consciousness will likely turn out to be just a state that natural brains can enter, not unlike the way water can enter a solid state. Nothing strange or dualistic or non-physical or non-computable about it!
But a computer simulation of it won't have consciousness any more than a simulation of an ice cube will have coldness. Computer simulations of things do not equal the things they simulate. (I wish I had a nickel for every time I've said that here :-)
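To make the point concrete, here's a toy sketch of my own (nobody's actual model, just an illustration): a program that simulates an ice cube manipulates a number that *represents* temperature, and nothing about the hardware running it gets cold no matter what that number says.

# Toy illustration: a simulated ice cube is a data structure,
# not a cold object. The CPU executing this stays at room
# temperature regardless of the value stored below.

class SimulatedIceCube:
    def __init__(self, temperature_c: float = -5.0):
        self.temperature_c = temperature_c  # a number, not coldness

    def warm(self, degrees: float) -> None:
        """Advance the simulation: raise the represented temperature."""
        self.temperature_c += degrees

    @property
    def is_frozen(self) -> bool:
        return self.temperature_c <= 0.0

cube = SimulatedIceCube()
cube.warm(3.0)
print(cube.temperature_c, cube.is_frozen)  # -2.0 True -- yet nothing here is cold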
A computer simulation of a brain *would*, however, equal a brain in the special case that natural brains do in fact exist as computers. But real brains have semantics, and it looks to me like real computers do not and cannot, so I do not equate natural brains with computers.
The computationalist theory of mind seems like a nifty idea, but I think it does not compute.
-gts