[ExI] The symbol grounding problem in strong AI

JOSHUA JOB nanite1018 at gmail.com
Fri Jan 1 16:08:30 UTC 2010


> If you expect to find consciousness in or stemming from a computer  
> simulation of a brain then I would suppose you might also expect to  
> eat a photo of a ham sandwich off a lunch menu and find that it  
> tastes like the ham sandwich it simulates. After all, on your logic  
> the simulation of the ham sandwich is implemented in the substrate  
> of the menu. But that piece of paper won't taste much like a ham  
> sandwich, now will it? And why not? Because, as I keep trying to  
> communicate to you, simulations of things do not equal the things  
> they simulate. Descriptions of things do not equal the things they  
> describe.
>
> -gts
I'll just jump in to say that this is a bad analogy, at best.  
Consciousness is not a thing in the world that makes things happen  
directly. Consciousness only affects the world by "directing" the
body to do things. If your simulation of a ham sandwich can also
interact with my taste buds exactly like a ham sandwich (akin to  
hooking up a simulation of the brain to a body through electro-neuro  
connections, etc.), then fine. But a really good photo isn't a perfect
simulation, and it certainly cannot interact with the world in the way  
a sandwich actually does. Same thing with your other analogy about  
thunderstorms. A simulation of a thunderstorm can't make things wet in  
the real world because it is in a computer. But it can make the  
entities in the simulation "wet". And if you had a really complex
machine that could make wind, distribute water, and show on a big
screen what the thunderstorm would look like from the ground, then it
could make things wet.

Divorcing the simulation from the world will prevent it from doing the
things that the real thing would do. But if you connect it to the
"real" world in a way that lets it do everything the real thing
normally does (all the outputs from your simulation of the brain, for
example, direct a body, and all the inputs from the body's senses go
to the simulation), then it will do exactly what the real thing does.
So all you have to do is connect your simulation of a brain to a body,
and it will behave just like the actual brain.
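
To make that input/output coupling concrete, here is a minimal sketch
in Python. It is purely illustrative: SimulatedBrain, Body, and their
methods are placeholder names I'm making up for the example, not any
real neural-simulation API.

    class SimulatedBrain:
        def step(self, sensory_input):
            # Advance the simulation one tick and return motor commands.
            ...

    class Body:
        def read_senses(self):
            # Report whatever the body's sensors currently detect.
            ...

        def apply_motor_commands(self, commands):
            # Drive the body's actuators with the simulation's outputs.
            ...

    def run(brain, body):
        # The coupling described above: every sense signal goes into the
        # simulation, every output drives the body, so the combined
        # system acts on the world just as a real brain-plus-body would.
        while True:
            commands = brain.step(body.read_senses())
            body.apply_motor_commands(commands)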


Joshua Job
nanite1018 at gmail.com
