[ExI] GPT-4 on its inability to solve the symbol grounding problem

Ben Zaiboc ben at zaiboc.net
Sat Apr 15 08:15:08 UTC 2023

I have a suggestion.

Instead of 'ground', try using the word 'associate'. That seems to me 
more useful. 'Grounding' implies that there is a single basis for the 
meaning of whatever is being 'grounded'. But we know that this can't be 
the case, e.g. my example of Smaug. Different people will create 
different associations for the word, depending on their prior knowledge 
of dragons, the story it appears in, images of dragons, or a specific 
image of this particular dragon, and loads of other associations. You 
can't say that 'Smaug' is 'grounded' in any single thing, even for one 
individual, never mind many, so using the term doesn't do justice to 
what is actually happening. I think it actually obscures what's 
happening, misleading us into assuming that a word can only be 
associated with one experience (or one 'real-world thing', if you prefer).

The same is true for things that actually do exist, like apples. There 
are many, many apples, all different, and many, many experiences people 
have associated with them. The word 'apple' cannot possibly be based on 
one single thing; it's an abstraction built from many associations. 
Using the word 'grounded' obscures this fact.

Now I'm waiting for someone to say "but 'associating' is not the same 
thing as 'grounding'!". If I'm right, and 'someone' does indeed object, 
I'd be interested in their justification, seeing as associations are 
all we have to work with in any information-processing system, 
including the brain.
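To make the association picture concrete, here is a minimal sketch (my own toy illustration, not anything from the literature): each agent links the same word to its own set of experiences, so "meaning" is just an association set, and no two agents need share a single ground. The Agent class and the example experiences are invented for illustration.

```python
# Toy sketch (illustrative only): meaning as a set of associations,
# not a single "ground". Each agent links the same word to a
# different collection of prior experiences.
from collections import defaultdict


class Agent:
    def __init__(self, name):
        self.name = name
        # word -> set of associated experiences
        self.associations = defaultdict(set)

    def associate(self, word, experience):
        self.associations[word].add(experience)

    def meaning(self, word):
        # "Meaning" here is nothing more than the whole association set.
        return self.associations[word]


alice = Agent("Alice")
alice.associate("Smaug", "illustration in my copy of The Hobbit")
alice.associate("Smaug", "general knowledge of dragons")

bob = Agent("Bob")
bob.associate("Smaug", "the film version")

# Same symbol, different association sets -- no single shared ground.
print(alice.meaning("Smaug") == bob.meaning("Smaug"))  # False
```

Nothing about this requires the word to bottom out in one privileged referent; add or remove associations and the "meaning" simply shifts.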

On the other hand, if there is no objection, why don't we give it a try? 
Drop the word 'grounding' altogether, use 'associating' instead.

For starters, the "symbol grounding problem" becomes "the symbol 
association problem".
Suddenly, it doesn't seem so much of a problem, does it?

