[ExI] The symbol grounding problem in strong AI

John Clark jonkc at bellsouth.net
Sat Jan 2 05:44:07 UTC 2010

On Jan 1, 2010, Gordon Swobe wrote:

> In my opinion you fall off the rails there and wander into the land of metaphysical dualism.

It may be dualism to say that what a thing is and what a thing does are not the same, but it's not metaphysical, it's just logical. For example, saying mind is what a brain does is no more metaphysical than saying going fast is what a racing car does.

> you, on the other hand, should describe yourself as such, given that you believe we can get intentional entities from running programs.

Intentional means calculable, and calculable sounds to me like something programs should be rather good at.

> The conventional strong AI research program is based on that same false premise

There is no such thing as strong AI research, there is just AI research. Nobody is doing Artificial Consciousness research because claiming success would be just too easy.

> If you expect to find consciousness in or stemming from a computer simulation of a brain then I would suppose you might also expect to eat a photo of a ham sandwich off a lunch menu and find that it tastes like the ham sandwich it simulates.

I haven't actually tried it, but I don't believe that would work very well. It's just a hunch.

> After all, on your logic the simulation of the ham sandwich is implemented in the substrate of the menu. But that piece of paper won't taste much like a ham sandwich, now will it?
> And why not?

Because a ham sandwich is a noun, a photo of one is a very different noun, and consciousness is not a noun at all.

> What you cannot or refuse to see is that a formal program simulating the brain cannot cause consciousness

So you and Searle keep telling us over and over and over again, but Gordon, my problem is that I think Charles Darwin was smarter than either one of you. And the fossil record also thinks Darwin was smarter.

 John K Clark 


More information about the extropy-chat mailing list