[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Sun Apr 16 17:19:22 UTC 2023


On Sun, Apr 16, 2023 at 6:39 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:


> Do you at least agree there's sufficient information in text to learn the
> meaning of the word 'two'?

If you enter the words or numerals for one, two, three, four and ask GPT-4
to continue, it will dutifully, like a good autocomplete algorithm, continue
the series with five, six, seven, eight, and so on. To you, this is probably
evidence that it consciously understands the meaning of “two,” but it is not
true conscious understanding of numbers, and GPT-4 will be the first to
admit it.
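
For what it's worth, reproducing this behavior through the API takes only a
few lines. A minimal sketch using the openai Python package (the prompt
wording and the sample completion below are illustrative, not a transcript):

    import openai

    openai.api_key = "sk-..."  # your API key here

    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user",
                   "content": "Continue the series: one, two, three, four"}],
    )
    print(response.choices[0].message.content)
    # e.g. "five, six, seven, eight, nine, ten"

The model extends the pattern exactly as any autocomplete would; whether
that amounts to understanding is the question at issue.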

This goes back to the discussion of the symbol grounding problem in
mathematics that we had just a day or so ago. Apparently, nothing I wrote
there registered.


-gts






>> Spend 10 minutes putting yourself in the shoes of someone in a Korean
>> library (with no pictures or translations) given thousands of years to
>> figure out what any of the symbols mean.