[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Sun Apr 16 07:03:13 UTC 2023


On Sat, Apr 15, 2023 at 10:47 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

I disagree completely.

The LLM can learn patterns, yes (this is part of what GPT does and reports
doing), but no, it cannot learn what the parts of speech mean, nor does it
claim to know what they mean.

> Numbers are often listed in a certain sequence (e.g. in ascending order),
> often used before nouns (can infer it gives a count).

What is a number? What is a noun? A person, place, or thing, you say? What
is a person, a place, or a thing?

> Words that can stand alone as sentences are verbs.

What is a verb? Action? What is action? For that matter, what is a sentence?

Likewise with all other parts of speech. Even if it can classify every noun
as belonging to a certain set of symbols X, and every verb as belonging to
another set of symbols Y, it could still never know what a noun or a verb
is. It can know only the pattern of how these classes of symbols tend to
appear together.
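
To make that concrete, here is a minimal toy sketch of my own (an
illustration only, not anything GPT-4 actually runs). It sorts word-symbols
into classes purely from their co-occurrence patterns. The tiny corpus, the
one-word window, and the 0.9 similarity threshold are all invented for the
example; nothing in it encodes what any symbol refers to.

from collections import defaultdict
from math import sqrt

# Toy corpus: four strings of symbols, meaningless to the machine.
corpus = [
    "the cat chased the dog",
    "the dog chased the cat",
    "the cat ate the fish",
    "the dog ate the bone",
]

# Count which symbols appear immediately next to which other symbols.
cooc = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - 1), min(len(words), i + 2)):
            if j != i:
                cooc[w][words[j]] += 1

def cosine(a, b):
    # Similarity of two co-occurrence profiles; no meanings involved.
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Symbols with similar co-occurrence profiles land in the same class.
vocab = sorted(cooc)
for w in vocab:
    peers = [v for v in vocab if v != w and cosine(cooc[w], cooc[v]) > 0.9]
    print(w, "->", peers)

Run it and "cat", "dog", "fish", and "bone" fall into one class while
"chased" and "ate" fall into another, yet the program knows nothing about
cats, dogs, or eating. That is all "classifying symbols into X and Y" from
pattern alone amounts to.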

GPT-4 can learn only the patterns and relationships among word-symbols, with
no knowledge of the meanings of the individual words, *exactly as it reports
that it does*. It does this extremely well, and it is in this way that it
can *simulate* human understanding.


-gts