[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Sun Dec 13 16:49:20 UTC 2009


--- On Sun, 12/13/09, John Clark <jonkc at bellsouth.net> wrote:

> You are assuming the little man has qualities...

I think you've confused the parable with the symbol grounding problem it illustrates. I sometimes make the same mistake myself, so I've changed the subject line to point to the meaning of the parable.

Does a piece of paper understand the words written on it? Does a shopping list understand the meaning of "bread" and "milk"? If you think it does not -- if you think the understanding of symbols (semantics) takes place only in conscious minds -- then you agree with Searle and most people.

If Searle has it right, then formal programs have no more consciousness than shopping lists, and so they will never overcome the symbol grounding problem. No matter how advanced software/hardware systems may become, they will never understand the meanings of the symbols they manipulate.

The challenge for us, then, is not to find technical problems in a silly story about a man in a Chinese room; it is rather to show that formal programs differ from shopping lists in some important way, one that allows programs to overcome the symbol grounding problem.

-gts
