[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Sun Apr 16 17:39:29 UTC 2023


In other words, the language of mathematics is just one of many languages
that it “understands.” The scare quotes indicate that, because it has no
insight into the world outside of language and symbols, it cannot ground
those symbols, and grounding is what we normally mean by conscious
understanding.
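
As a minimal sketch of the series continuation described in the quoted
message below, assuming the openai Python package as it existed at the
time (the model name, prompt, and printed output here are illustrative,
not a recorded transcript):

    # Assumes `pip install openai` (the 0.27-era API) and OPENAI_API_KEY
    # set in the environment.
    import openai

    # Ask the model to continue a number series, autocomplete-style.
    response = openai.ChatCompletion.create(
        model="gpt-4",  # model name as of April 2023
        messages=[{
            "role": "user",
            "content": "Continue the series: one, two, three, four",
        }],
    )
    print(response["choices"][0]["message"]["content"])
    # Typically something like: "five, six, seven, eight, ..."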

-gts

On Sun, Apr 16, 2023 at 11:19 AM Gordon Swobe <gordon.swobe at gmail.com>
wrote:

> On Sun, Apr 16, 2023 at 6:39 AM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Do you at least agree there's sufficient information in text to learn the
>> meaning of the word 'two'?
>>
>
> If you enter the words or the numerals for one, two, three, four and ask
> GPT-4 to continue, it will dutifully, like a good autocomplete algorithm,
> continue the series with five, six, seven, eight, and so on. To you, this
> is probably evidence that it consciously understands the meaning of “two,”
> but it is not true conscious understanding of numbers, and GPT-4 will be
> the first to admit it.
>
> This goes back to the discussion of the symbol grounding problem in
> mathematics that we had just a day or so ago. Apparently, nothing I
> wrote there registered.
>
>
> -gts
>
>>> Spend 10 minutes putting yourself in the shoes of someone in a Korean
>>> library (with no pictures or translations) given thousands of years to
>>> figure out what any of the symbols mean.
>>>
>>
>