[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Wed Apr 19 06:43:43 UTC 2023

Ben, I can't locate the message, but you asked for my thoughts on the
difference between a language model solving what you have called the word
association problem and solving the symbol grounding problem. In my view,
the difference is that understanding statistical associations between words
does not require knowledge of their meanings. While this distinction might
make no practical difference, it becomes important if the question is
whether the model genuinely understands the content of its inputs and
outputs or merely simulates that understanding.


