[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem

Jason Resch jasonresch at gmail.com
Sun Apr 16 20:04:53 UTC 2023


On Sun, Apr 16, 2023, 3:28 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:

>
> On Sun, Apr 16, 2023 at 12:06 PM Jason Resch <jasonresch at gmail.com> wrote:
>
>>
>>
>> On Sun, Apr 16, 2023 at 12:39 PM Gordon Swobe <gordon.swobe at gmail.com>
>> wrote:
>>
>>> In other words, the language of mathematics is just one of many
>>> languages that it "understands," the scare quotes to indicate that because
>>> it has no insight into the world outside of language and symbols, it cannot
>>> ground the symbols, which is what we normally mean by conscious
>>> understanding.
>>>
>> It "grounds" (I put that in square quote because there is never any
>> direct connection with the objects themselves) the meaning to the patterns
>> inherent within the set of symbols themselves. Do you acknowledge that
>> these patterns exist?
>>
>
> Good that you put "grounds" in scare quotes because that is not what the
> word means. Yes, GPT-4 finds patterns, including the pattern 1, 2, 3, 4...
>
> In fact, finding patterns is exactly what it excels at doing both in math
> and in English. It "knows" that after the symbols "1", "2", "3", "4", the
> probability of the next symbol being "5" is nearly 100%, and so that is how
> it continues the prompt.
>
> Incidentally, as you might know, the answers to questions posed to GPT are
> technically called *continuations* because that is exactly what they are:
> continuations, as in auto-complete continuations.
>
> To ground the symbol "two" or any other number -- to truly understand that
> the sequence is a sequence of numbers and what numbers are -- it needs
> access to the referents of numbers, which is what the symbol grounding
> problem is all about. The referents exist outside of the language of
> mathematics.
>
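(As an aside on the "continuations" point above, here is a toy sketch of
what a continuation is mechanically. A real transformer conditions on the
whole context window with learned weights; this bigram counter, and its
miniature invented corpus, are only stand-ins to make the idea concrete:)

from collections import Counter, defaultdict

# Invented miniature "corpus"; GPT-4's actual training data is vastly larger.
corpus = "1 2 3 4 5 . 1 2 3 4 5 . 1 2 3 4 5".split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def continuation(token):
    """Return the most frequent next token and its estimated probability."""
    counts = follows[token]
    best, n = counts.most_common(1)[0]
    return best, n / sum(counts.values())

print(continuation("4"))  # ('5', 1.0): "5" followed "4" every time it appeared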

But those referents aren't outside the patterns within language and the
corpus of text it has access to. Consider GPT encountering a sentence like:
 "This sentence has five words."

Can the model not count the words in a sentence the way a child can count
pieces of candy? Is that sentence not a direct referent/exemplar for a set
of cardinality five?
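(To make that concrete, a minimal sketch; the number-word lookup table here
is mine, invented for illustration, not anything inside GPT-4:)

number_words = {"one": 1, "two": 2, "three": 3, "four": 4, "five": 5}

sentence = "This sentence has five words."
words = sentence.rstrip(".").split()   # ['This', 'sentence', 'has', 'five', 'words']

# The sentence states its own cardinality; checking the claim requires
# nothing outside the symbols themselves.
claimed = next(number_words[w.lower()] for w in words if w.lower() in number_words)
print(len(words), claimed, len(words) == claimed)   # 5 5 True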



>> Can something (anything?) study/analyze these patterns to learn about
>> (or possibly even understand) them?
>>
>
> We do.
>

But AI can't because...?
(Consider the case of Helen Keller in your answer.)

> I explained how and why I think we do and you agreed with me, stating that
> along with me, you sided with mathematical platonists. Kantians have a
> similar answer to how it is that humans ground mathematical symbols, as I
> also mentioned.
>

I think the question of Platonism is somewhat independent of the question
of how humans learn to understand math, though.


Jason


