[ExI] GPT-4 on its inability to solve the symbol grounding problem

Brent Allsop brent.allsop at gmail.com
Fri Apr 14 20:19:12 UTC 2023


I"m with Giovanni in this.
It's abstract knowledge, but very powerful and intelligent knowledge, that
is able to model, predict, reason about and a whole lot more dealing with a
great many things, including descriptions of stuff in the real world.
But this is still very different from phenomenal, like we have, composed of
qualities like redness and greenness.
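
A concrete version of the drawing probe Giovanni describes below: ask the
model to draw by emitting SVG markup and judge the result. This is only a
minimal sketch; the model name and prompt are illustrative, and it assumes
the `openai` Python package (pre-1.0 API) with a key in OPENAI_API_KEY.

# Ask a text-trained model to "draw" an apple and a pear as SVG.
import openai

prompt = ("Draw an apple next to a pear as SVG markup. "
          "Reply with only the SVG, no commentary.")

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)

# Save the markup; open it in a browser to see whether the two
# shapes are distinguishable.
with open("apple_pear.svg", "w") as f:
    f.write(response["choices"][0]["message"]["content"])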


On Fri, Apr 14, 2023 at 1:55 PM Giovanni Santostasi via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Gordon,
> I showed you the different pics GPT-4 can create given nonvisual training.
> How can it draw an apple and know how to distinguish it from a pear if it
> has no meaning for these words? How can it put a bowl on top of a table if
> it doesn't understand above or below? How can it put eyes on the face of a
> human if it doesn't understand what eyes are and where they are located on
> a human face? How is all this possible without meaning? These tasks have
> nothing to do with the statistical properties of words, given that they
> are spatial tasks that go beyond verbal communication. How do you explain
> all this?
> Giovanni
>
> On Fri, Apr 14, 2023 at 9:47 AM Gordon Swobe <gordon.swobe at gmail.com>
> wrote:
>
>> On Thu, Apr 13, 2023 at 5:19 PM Giovanni Santostasi <
>> gsantostasi at gmail.com> wrote:
>>
>>> I think the common understanding of referent is that certain words
>>> (not all for sure, and this is an important point) refer or point to
>>> certain objects in the real world.
>>
>> If I wrote something like that about pointing to certain objects in the
>> real world, then I might have confused you if you took me too literally.
>> When you point to an apple and say "this is an apple," you may or may not
>> literally be pointing your finger physically at the apple. Linguistically,
>> you are pointing to what you mean by "apple" and presumably the listener
>> understands what you mean.
>>
>> You could be hallucinating the apple such that the listener has no idea
>> what you mean, but you know what you mean.
>>
>> When an LLM sees the word "apple" in its training, there is no meaning
>> attached to the symbol.
>>
>> -gts
>>
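
Gordon's point about "apple" can be made concrete: during training the
model never receives the word at all, only integer token IDs with no
intrinsic connection to fruit. A minimal sketch, assuming the `tiktoken`
package (cl100k_base is the encoding used by GPT-4-class models):

# Show what the model actually "sees": opaque integer IDs, not referents.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
print(enc.encode("apple"))  # a list of integer IDs, nothing more
print(enc.encode("pear"))   # a different, equally arbitrary list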