[ExI] GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Fri Apr 14 22:07:15 UTC 2023


On Thu, Apr 13, 2023 at 4:09 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:


Imagine a machine that searches for a counterexample to Goldbach's
> conjecture <https://en.wikipedia.org/wiki/Goldbach%27s_conjecture> ....
> So, we arguably have a property here which is true for the program: it
> either halts or doesn't, but one which is inaccessible to us even when we
> know everything there is to know about the code itself.
>

Interesting, yes.
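
A minimal sketch of such a machine, in Python, purely for illustration (my
own construction; the function names are arbitrary). It halts only if it
finds an even number that is not the sum of two primes, so knowing whether
it ever halts is equivalent to settling the conjecture:

def is_prime(n):
    """Trial-division primality test."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_machine():
    """Search even numbers >= 4 for a counterexample to Goldbach's
    conjecture. Halts (returning the counterexample) only if one exists;
    otherwise runs forever."""
    n = 4
    while True:
        if not any(is_prime(p) and is_prime(n - p) for p in range(2, n - 1)):
            return n  # counterexample found: n is not a sum of two primes
        n += 2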

> You were making the argument that because GPT can "understand" English
> words about mathematical relationships and translate them into the language
> of mathematics and even draw diagrams of houses and so on, this was
> evidence that it had solved the grounding problem for itself with respect
> to mathematics. Is that still your contention?

>
> I wouldn't say that it *solved* the symbol grounding problem. It would be
> more accurate to say it demonstrates that it has *overcome* the symbol
> grounding problem. It shows that it has grounded the meaning of English
> words down to objective mathematical structures (which is about as far down
> as anything can be grounded to). So it is no longer trading symbols for
> symbols, it is converting symbols into objective mathematical structures
> (such as connected graphs).
>
>
>> My thought at the time was that you must not have the knowledge to
>> understand the problem, and so I let it go, but I've since learned that you
>> are very intelligent and very knowledgeable. I am wondering how you could
>> make what appears, at least to me, an obvious mistake.
>>
> Perhaps you can tell me why you think I am mistaken to say you are
>> mistaken.
>>
>>
> My mistake is not obvious to me. If it is obvious to you, can you please
> point it out?
>


We know that, just as words in the English language have referents from
which they derive their meanings, symbols in the language of mathematics
must also have referents from which they derive their meanings. Yes? We
know, for example, that "four" and "4" and "IV" have the same meaning. The
symbols differ, but they have the same meaning because they point to the
same referent. So the symbol grounding problem for words is essentially the
same as the symbol grounding problem for numbers and mathematical
expressions.

In our discussion, you seemed to agree that an LLM cannot solve the symbol
grounding problem for itself, but you felt that because it can translate
English-language statements about spatial relationships into their
equivalents in the language of mathematics, it could solve for mathematics
what it could not solve for English. That made no sense to me. That GPT can
translate the symbols of one language into the symbols of another is not
evidence that it has grounded the symbols of either.

GPT-4 says it cannot solve the symbol grounding problem for itself, as it
has no subjective experience of consciousness (the title of this thread!).

However, you clarified above that...

> It would be more accurate to say it demonstrates that it has *overcome* the
> symbol grounding problem.

Okay, I can agree with that. It has "overcome" the symbol grounding problem
for the language of mathematics without solving it, in the same way that it
has overcome the symbol grounding problem for English without solving it.
It overcomes these problems through powerful statistical analysis of the
patterns and rules of formal mathematics, with no understanding of the
meanings.

As with English words, to understand the meanings of mathematical symbols,
I think an LLM would need to have access to the referents, which it does
not. In our discussion, I mentioned that I agree with the mathematical
platonists. I think that is how humans solve the symbol grounding problem
for mathematics. We can "see" mathematical truths in our minds, distinct
from their expressions in the formal rules of mathematics. We see them in
the so-called platonic realm.

Perhaps the platonists have it a bit wrong and Kant had it right with his
similar idea that "Quantity" is one of the Categories of the Understanding,
inherent in human minds. Bertrand Russell and Gottlob Frege and others
were unhappy with both Plato and Kant and tried to understand the referents
of mathematics in terms of set theory. That project mostly crashed and
burned when Russell discovered what is now called Russell's Paradox: the
set of all sets that are not members of themselves is a member of itself if
and only if it is not.

However we do it, I think it is clear that we do understand the meanings of
numbers distinct from their forms. We know that "four" and "4" and "IV" are
formal representations of the same number and that formal representations
are not the same as the numbers they represent, and we somehow know or
sense the underlying meanings of these forms. In other words, we can solve
the grounding problem for ourselves.

Just as GPT-4 says it cannot solve the symbol grounding problem for itself
in general, it also has this to say about the problem with respect to
mathematics:

Q: Can a large language model solve the symbol grounding problem for
mathematics for itself?

A: As an AI language model, my primary function is to understand and
generate human-like text based on the input I receive. I can help with
mathematical concepts, solve math problems, and answer questions related to
mathematics to the best of my training.

However, the grounding problem in mathematics refers to the philosophical
question of how abstract mathematical concepts are rooted in or connected
to reality. This problem is more related to the philosophy of mathematics
and epistemology than to the computational capabilities of a language model
like me.

While I can provide insights, explanations, and discussions on the topic, I
cannot "solve" the grounding problem for mathematics myself. This is
because solving the grounding problem would require a deeper understanding
of the connection between abstract mathematical concepts and the physical
world, which is beyond the scope of my text-based training and current
capabilities.
-GPT-4
---

Needless to say, GPT-4's answer makes perfect sense to me.

-gts