[ExI] GPT-4 on its inability to solve the symbol grounding problem
Gordon Swobe
gordon.swobe at gmail.com
Fri Apr 14 16:47:17 UTC 2023
On Thu, Apr 13, 2023 at 5:19 PM Giovanni Santostasi <gsantostasi at gmail.com>
wrote:
> I think the common understanding of referent is that certain words
> (not all for sure, and this is an important point) refer or point to
> certain objects in the real world.
If I wrote something like that about pointing to certain objects in the
real world, then I might have confused you if you took me too literally.
When you point to an apple and say "this is an apple," you may or may not
literally be pointing your finger at the apple. Linguistically, you are
pointing to what you mean by "apple," and presumably the listener
understands what you mean.
You could be hallucinating the apple such that the listener has no idea
what you mean, but you know what you mean.
When an LLM sees the word "apple" in its training data, there is no meaning
attached to the symbol.
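To make the point concrete, here is a minimal toy sketch in Python, not any
real model's tokenizer: the vocabulary, the encode() helper, and the embedding
table are all hypothetical stand-ins. All a language model ever receives is an
arbitrary integer ID per symbol and a vector of numbers shaped by text
statistics; nothing in either one points at a physical apple.

    # Toy illustration only: arbitrary IDs and random vectors stand in for a
    # real tokenizer and a learned embedding table.
    import random

    # Hypothetical vocabulary built purely from text; the IDs are arbitrary.
    vocab = {"this": 0, "is": 1, "an": 2, "apple": 3, "red": 4, "fruit": 5}

    def encode(sentence: str) -> list[int]:
        """Replace each word with its integer ID -- all the model 'sees'."""
        return [vocab[w] for w in sentence.lower().split()]

    # Stand-in for a learned embedding table: each ID maps to a vector of
    # numbers with no intrinsic connection to any object in the world.
    random.seed(0)
    embeddings = {i: [random.uniform(-1, 1) for _ in range(4)]
                  for i in vocab.values()}

    print(encode("this is an apple"))      # e.g. [0, 1, 2, 3]
    print(embeddings[vocab["apple"]])      # numbers, not an apple

Whatever structure those vectors acquire during training comes from
co-occurrence with other symbols, which is exactly what the grounding
question is about.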
-gts