[ExI] GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Fri Apr 14 20:46:47 UTC 2023


I’ll bet that if you ask it to draw a perfect circle, it will draw one
without ever having “seen” one. It should have learned how to draw one from
language about circles, including the mathematical language that describes
them. Is that really so amazing?
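
As an aside, the point that a circle is fully specified by its equation can
be sketched in a few lines (a minimal illustration of the math, not anything
GPT-4 literally runs; the function name is mine):

```python
import math

def circle_points(r=1.0, n=8):
    # Parametric equation of a circle: x = r*cos(t), y = r*sin(t).
    # The points follow from the equation alone -- no image required.
    return [(r * math.cos(2 * math.pi * k / n),
             r * math.sin(2 * math.pi * k / n)) for k in range(n)]

pts = circle_points()
# Every generated point lies exactly distance r from the origin.
assert all(abs(math.hypot(x, y) - 1.0) < 1e-9 for x, y in pts)
```

Anything that has absorbed this equation can, in principle, produce a
perfect circle from the description alone.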

-gts


On Fri, Apr 14, 2023 at 2:17 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:

>
>
> On Fri, Apr 14, 2023 at 1:54 PM Giovanni Santostasi <gsantostasi at gmail.com>
> wrote:
>
>>
>> I showed you the different pics GPT-4 can create given nonvisual
>> training. How can it draw an apple and know how to distinguish it from a
>> pear…
>>
>> These tasks have nothing to do with the statistical properties of words,
>> given that they are spatial tasks and go beyond verbal communication. How
>> do you explain all this?
>>
>
>
> They *do* have to do with the statistical properties of words and symbols
> and the relations and patterns between them. The shapes of pears and apples
> (and eyes etc) are describable and distinguishable in the language of
> mathematics.
>
> I agree it is amazing, but the “meaning” is something we assign to the
> output.
>
> -gts
>

