[ExI] GPT-4 on its inability to solve the symbol grounding problem

William Flynn Wallace foozler83 at gmail.com
Wed Apr 5 21:13:22 UTC 2023


If a possum had no abstract idea of an apple, then only the first apple it
saw would be regarded as an apple; it could never recognize a second one as
the same kind of thing.  All animals abstract and generalize.   bill w

On Wed, Apr 5, 2023 at 3:05 PM Giovanni Santostasi via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Gordon,
> In fact, now that I'm thinking about it, it is the exact opposite of what
> you say. Not only are referents unnecessary for language; it is because of
> language that we can make the association between the abstract ideas in our
> heads and objects in the external world. We can associate a physical apple
> with the idea of an apple only because we are able to abstract in the first
> place. That is the real essence of language. Abstraction is the ability to
> extract the essential properties of an event, an object, or another
> abstract idea, beyond the immediate physical characteristics of the thing
> being abstracted. This is what we do when we see one apple and say "1", or
> one apple and one orange and say "2".
> I would say that it is language that allows us to recognize objects in the
> world as members of a category and to give them names or qualities. You
> can still perceive an apple as something: you can smell it and taste it,
> and a simpler animal may associate an apple with something good to eat,
> but it cannot associate it with a word or an idea, because it cannot make
> the abstraction to a general concept of an apple. That is what language is
> about, and it is the opposite of what you claim. Without language (the
> creation of abstract ideas and generalizations in our heads) there is no
> object to refer to, not the other way around.
>
> Giovanni
>
> On Wed, Apr 5, 2023 at 12:29 PM Giovanni Santostasi <gsantostasi at gmail.com>
> wrote:
>
>> Gordon,
>> you say: "By referents, I mean the things and ideas outside of language
>> to which words point. If you hold an apple in your hand and say 'this is an
>> apple,' the apple is the referent that gives your word 'apple' meaning."
>>
>> Absolutely not. This is not how language works.
>> It takes a long time for a child, who is strongly wired to learn language,
>> to understand what you mean when you point at an apple and say "apple". It
>> also requires a certain level of brain development. Teaching children
>> colors is even more difficult and takes even longer. The difficulty lies
>> in exactly the opposite of what you say is the essence and importance of
>> having referents: it is all in the ABSTRACTION that is needed to actually
>> make the association.
>>
>> This has been pointed out to you many times (and also to Brent, with his
>> insistence on the "quality of redness" nonsense). It takes time to make the
>> association between what an adult calls an apple and what a child sees.
>>
>> What is the essence of an apple? Is it being round? Being a round edible
>> object (and so different from a round ball)? What about an orange? That is
>> another round edible object, but it is not an apple because... What about
>> an apple in a picture versus a real apple? What about our dog named Apple?
>> You understand what I'm trying to express. It is not as easy as you think
>> to associate the word "apple" with an object, because it is a complex
>> process that has almost nothing to do with the referent itself. The
>> referent plays very little role, and it is not at all what gives language
>> its meaning and power. It is all in the ABSTRACTIONS, all the
>> relationships at higher levels (statistical relationships, in fact, that
>> our brains compute approximately).
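>>
>> To make this concrete, here is a minimal sketch in Python (a toy
>> illustration only: the context words and the co-occurrence counts are
>> invented, not taken from any real corpus) of how relationships between
>> words can emerge from usage statistics alone, with no referent consulted
>> anywhere:
>>
>> # Toy co-occurrence counts stand in for the distributional statistics
>> # that brains (or language models) track approximately.  The context
>> # words (eat, round, hard) and all the counts are invented.
>> import math
>>
>> cooccurrence = {
>>     "apple":  [9, 7, 1],
>>     "orange": [8, 6, 1],
>>     "rock":   [0, 3, 9],
>> }
>>
>> def cosine(u, v):
>>     dot = sum(a * b for a, b in zip(u, v))
>>     norm_u = math.sqrt(sum(a * a for a in u))
>>     norm_v = math.sqrt(sum(b * b for b in v))
>>     return dot / (norm_u * norm_v)
>>
>> # "apple" comes out closer to "orange" than to "rock" purely from the
>> # statistics of usage; no physical referent appears anywhere.
>> assert cosine(cooccurrence["apple"], cooccurrence["orange"]) > \
>>        cosine(cooccurrence["apple"], cooccurrence["rock"])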
>>
>> This is why we can give meaning to things that are abstract in the first
>> place, like love or meaning itself.
>> This is why we can imagine dragons, flying pigs, and so on. This is why
>> languages can be bootstrapped from a single axiom or definition (even an
>> arbitrary one), as one does with the null set in mathematics.
>>
>> I have looked for a paper by somebody on how one could bootstrap an
>> entire language from something similar to the null set; it is probably out
>> there somewhere, and if not, one day I will try to write it myself. But
>> mathematics derived from the null set is at least a counterexample to your
>> claim that language needs referents for meaning to emerge.
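>>
>> As a reminder of how that construction goes, here is a minimal sketch in
>> Python of the standard von Neumann construction (my own illustration; the
>> frozenset representation is just a convenient stand-in for pure sets):
>> every natural number is built out of the empty set alone, so the whole
>> structure gets its meaning with no external referent at all.
>>
>> # Von Neumann naturals: 0 is the empty set, and n + 1 is n together with
>> # {n}, so each number is just the set of all smaller numbers.  Python
>> # frozensets stand in for pure sets here.
>> EMPTY = frozenset()            # 0 := {}
>>
>> def successor(n):
>>     """n + 1 := n U {n}"""
>>     return n | frozenset([n])
>>
>> zero = EMPTY
>> one = successor(zero)          # {0}
>> two = successor(one)           # {0, 1}
>> three = successor(two)         # {0, 1, 2}
>>
>> # "Less than" falls out as set membership, and the ordinary numeral is
>> # recovered as the cardinality -- nothing outside the system is needed.
>> assert zero in three
>> assert len(three) == 3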
>>
>> Also, one has to be clever about how to use GPT-4 on these topics.
>> Instead of asking it whether it is conscious or understands language, do
>> tests to see whether it does.
>>
>> One test I did was to ask it to imagine a conversation between beings in
>> different dimensions that do not even share the same laws of physics, let
>> alone possible common referents like chemical elements or things like
>> rocks or stars. It gave me a very interesting example: using sequences of
>> 0s and 1s to let the other entity know it understood "similar" and
>> "different", following a sequence in time, "yes", "no", and so on. It was
>> a fascinating example, because it shows how you could communicate with
>> another being with almost no referents in common, needing only a few
>> fundamental abstract ideas, such as "different" and "similar", that do not
>> require any rocks to be defined. Once you establish "I'm here", "I
>> understand", "yes", "no", "same", and "different", you can little by
>> little build an entire language with essentially no physical referents.
>> GPT-4 came up with that.
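>>
>> In the same spirit, here is a minimal sketch in Python of how such a
>> handshake might begin (a hypothetical toy encoding of my own, not GPT-4's
>> actual transcript): with nothing shared but a channel of 0s and 1s,
>> echoing a message can stand for "same"/"yes" and inverting it for
>> "different"/"no".
>>
>> # A hypothetical bootstrap over a bare binary channel: agreement or
>> # sameness is signalled by repeating the last message, difference or
>> # negation by inverting every bit.  The encoding is my own invention.
>> ECHO = "echo"        # repeat the last message  -> "same" / "yes"
>> INVERT = "invert"    # flip every bit           -> "different" / "no"
>>
>> def respond(last_message: str, intent: str) -> str:
>>     """Reply by echoing (sameness) or inverting (difference)."""
>>     if intent == ECHO:
>>         return last_message
>>     return "".join("1" if bit == "0" else "0" for bit in last_message)
>>
>> # Party A sends an arbitrary pattern; party B's echo says "I am here and
>> # I received the same thing you sent" with no shared physical referent.
>> probe = "0101"
>> assert respond(probe, ECHO) == "0101"      # read as: same / yes
>> assert respond(probe, INVERT) == "1010"    # read as: different / no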
>>
>> So you are simply wrong, Gordon. The example above from GPT-4 shows that
>> referents may be useful for survival in biological beings like us, but
>> they are completely unnecessary for language and meaning.
>> The case should be closed.
>> Giovanni
>>
>> On Wed, Apr 5, 2023 at 7:20 AM BillK via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Wed, 5 Apr 2023 at 14:20, spike jones via extropy-chat
>>> <extropy-chat at lists.extropy.org> wrote:
>>> >
>>> > From: extropy-chat <extropy-chat-bounces at lists.extropy.org>
>>> > On Behalf Of Jason Resch via extropy-chat
>>> > >…This is a phenomenon we are all subject to and which we should all
>>> > >be aware of, called cognitive dissonance. It can occur whenever our
>>> > >brains encounter information perceived as threatening to our existing
>>> > >beliefs …Jason
>>> >
>>> > Ja.  In our world today, we are in a culture war in which many of our
>>> > most fundamental beliefs are being challenged.  Those with the most
>>> > cognitive dissonance see offense in what looks like perfectly innocuous
>>> > observations to those who have little if any cog-dis.  Thx Jason.
>>> >
>>> > spike
>>>
>>>
>>>
>>> No problem.   It just takes a bit of practice.  :)
>>>
>>> Quote:
>>> “Alice laughed. 'There's no use trying,' she said. 'One can't believe
>>> impossible things.'
>>>
>>> 'I daresay you haven't had much practice,' said the Queen. 'When I was
>>> your age, I always did it for half-an-hour a day. Why, sometimes I've
>>> believed as many as six impossible things before breakfast!”
>>> ― Lewis Carroll
>>> ---------------
>>>
>>> BillK
>>>

