[ExI] e: GPT-4 on its inability to solve the symbol grounding problem
jasonresch at gmail.com
Sun Apr 16 20:04:53 UTC 2023
On Sun, Apr 16, 2023, 3:28 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:
> On Sun, Apr 16, 2023 at 12:06 PM Jason Resch <jasonresch at gmail.com> wrote:
>> On Sun, Apr 16, 2023 at 12:39 PM Gordon Swobe <gordon.swobe at gmail.com>
>>> In other words, the language of mathematics is just one of many
>>> languages that it “understands,” the scare quotes to indicate that because
>>> it has no insight into the world outside of language and symbols, it cannot
>>> ground the symbols, which is what we normally mean by conscious
>>> understanding.
>> It "grounds" (I put that in scare quotes because there is never any
>> direct connection with the objects themselves) the meaning in the patterns
>> inherent within the set of symbols themselves. Do you acknowledge that
>> these patterns exist?
> Good that you put "grounds" in scare quotes because that is not what the
> word means. Yes, GPT-4 finds patterns, including the pattern 1, 2, 3, 4...
> In fact, finding patterns is exactly what it excels at doing both in math
> and in English. It "knows" that after the symbols "1", "2", "3", "4," the
> probability of the next symbol being "5" is nearly 100%, and so that is how
> it continues the prompt.
> Incidentally, as you might know, the answers to questions posed to GPT are
> technically called *continuations* because that is exactly what they are:
> continuations, as in auto-complete continuations.
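The continuation idea described above can be sketched in miniature. This is a hypothetical toy, not how GPT works: a bigram count model over a tiny corpus of digit sequences, which greedily appends the most probable next symbol, the way "1 2 3 4" gets continued with "5".

```python
from collections import Counter, defaultdict

# Toy corpus: repeated counting sequences.
corpus = "1 2 3 4 5 1 2 3 4 5 1 2 3 4 5".split()

# Count how often each symbol follows each other symbol.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def continue_prompt(prompt, steps=1):
    """Greedily append the most likely next symbol, `steps` times."""
    tokens = prompt.split()
    for _ in range(steps):
        counts = following[tokens[-1]]
        tokens.append(counts.most_common(1)[0][0])
    return " ".join(tokens)

print(continue_prompt("1 2 3 4"))  # continues with "5"
```

Real language models condition on long contexts with learned weights rather than bigram counts, but the "find the most probable continuation" shape is the same.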
> To ground the symbol "two" or any other number -- to truly understand that
> the sequence is a sequence of numbers and what are numbers -- it needs
> access to the referents of numbers which is what the symbol grounding
> problem is all about. The referents exist outside of the language of
> mathematics.
But they aren't outside the patterns within language and the corpus of text
it has access to. Consider GPT having a sentence like:
"This sentence has five words."
Can the model not count the words in a sentence like a child can count
pieces of candy? Is that sentence not a direct referent/exemplar for a set
of cardinality five?
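The point above can be made concrete: the sentence is itself an exemplar of the quantity it names, and counting its words is a purely symbolic operation. A minimal sketch:

```python
# The sentence serves as its own exemplar of "five":
# counting its words requires no access to anything outside the text.
sentence = "This sentence has five words."
word_count = len(sentence.split())
print(word_count)  # 5
```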
>> Can something (anything?) study/analyze these patterns to learn about
>> (or possibly even understand) them?
> We do.
But AI can't because...?
(Consider the case of Helen Keller in your answer.)
> I explained how and why I think we do and you agreed with me, stating that
> along with me, you sided with mathematical platonists. Kantians have a
> similar answer to how it is that humans ground mathematical symbols, as I
> also mentioned.
I think the question of Platonism is somewhat independent of the question
of how humans learn to understand math though.