[ExI] GPT-4 on its inability to solve the symbol grounding problem
gordon.swobe at gmail.com
Fri Apr 14 23:53:11 UTC 2023
On Fri, Apr 14, 2023 at 3:12 PM Giovanni Santostasi <gsantostasi at gmail.com> wrote:
> I have a really hard time imagining how this is derived from a simple
> autocomplete operation.
Obviously it is not a “simple” autocomplete operation. It is a highly
complex and sophisticated autocomplete operation, unlike anything the world
has ever seen, derived from statistical analysis of massive amounts of
text, but autocomplete is essentially what it is doing, predicting one
token after another after another…
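What "predicting one token after another" means can be sketched with a toy statistical model (the corpus, tokenization, and greedy decoding below are illustrative assumptions on my part; real models use neural networks trained on vastly larger corpora):

```python
from collections import Counter, defaultdict

# Toy "autocomplete": a bigram model that predicts each next token purely
# from co-occurrence statistics, with no access to any referents.
corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(token):
    """Return the statistically most likely next token."""
    return follows[token].most_common(1)[0][0]

def autocomplete(token, n):
    """Generate n more tokens, one after another, greedily."""
    out = [token]
    for _ in range(n):
        out.append(predict_next(out[-1]))
    return " ".join(out)

completion = autocomplete("the", 4)
```

The model produces fluent-looking continuations, yet nothing in it knows what a cat or a mat is; it only tracks which symbols tend to follow which.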
I don’t dispute that emergent properties might account for some of the
uncanny things these models can do, but I think conscious understanding of
the meanings of the words and sentences they generate with no possible
access to the referents/meanings is something else. The forms of words do
not contain the seeds of their meanings.
> On Fri, Apr 14, 2023 at 1:46 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:
>> I’ll bet if you ask it to draw a perfect circle, it will draw one without
>> ever having “seen” one. It should have learned from the language about
>> circles including the language of mathematics of circles how to draw one.
>> Is that really so amazing?
>> On Fri, Apr 14, 2023 at 2:17 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:
>>> On Fri, Apr 14, 2023 at 1:54 PM Giovanni Santostasi <
>>> gsantostasi at gmail.com> wrote:
>>>> I showed you the different pics GPT-4 can create given nonvisual
>>>> training. How can it draw an apple and know how to distinguish it from a
>>>> pear? These tasks have nothing to do with the statistical properties of
>>>> words given they are spatial tasks and go beyond verbal communication.
>>>> How do you explain all this?
>>> They *do* have to do with the statistical properties of words and
>>> symbols and the relations and patterns between them. The shapes of pears
>>> and apples (and eyes etc) are describable and distinguishable in the
>>> language of mathematics.
>>> I agree it is amazing, but the “meaning” is something we assign to the
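The point that a shape like a circle is fully describable in the language of mathematics can be sketched directly (the function names and parameters here are my own illustrative choices):

```python
import math

# A "perfect circle" follows from the parametric description
# x = r*cos(t), y = r*sin(t) -- no visual example of a circle is needed.
def circle_points(r, n):
    """Return n evenly spaced points on a circle of radius r at the origin."""
    return [(r * math.cos(2 * math.pi * k / n),
             r * math.sin(2 * math.pi * k / n)) for k in range(n)]

pts = circle_points(1.0, 12)

# Every generated point lies exactly r from the center: the perfection
# comes from the formula, not from ever having "seen" a circle.
assert all(abs(math.hypot(x, y) - 1.0) < 1e-9 for x, y in pts)
```

A system trained only on text that includes such mathematical descriptions could, in principle, reproduce the shape without any perceptual access to its referent.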