[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem
gordon.swobe at gmail.com
Sun Apr 16 07:44:11 UTC 2023
To put that another way, just as it can know the forms of words but not
their meanings, it can know the form of grammar but not its meaning. I have
no doubt that this goes a long way toward helping it write sensibly. The
forms of grammar are part of what is meant by the patterns of language.
On Sun, Apr 16, 2023 at 1:03 AM Gordon Swobe <gordon.swobe at gmail.com> wrote:
> On Sat, Apr 15, 2023 at 10:47 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
> I disagree completely.
> The LLM can learn patterns, yes (this is part of what GPT does and
> reports doing), but no, it cannot learn what the parts of speech
> mean, nor does it claim to know them.
> > Numbers are often listed in a certain sequence (e.g. in ascending
> > order) and often used before nouns (from which it can infer they give a count).
> What is a number? What is a noun? A person, place or thing, you say? What
> is a person, a place, or a thing?
> > Words that can stand alone as sentences are verbs.
> What is a verb? Action? What is action? For that matter, what is a
> Likewise with all other parts of speech. Even if it can classify every
> noun as belonging to a certain set of symbols X, and every verb as
> belonging to another set of symbols Y, it could still never know what is a
> noun or a verb. It can know only the pattern of how these classes of
> symbols tend to appear together.
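To make that claim concrete: even a trivial distributional procedure can sort word-symbols into purely formal classes from co-occurrence alone, with no access to meaning. The toy corpus, similarity measure, and threshold below are illustrative assumptions, not a description of GPT-4's actual training method:

```python
from collections import defaultdict

# Toy corpus: the learner sees only symbol sequences, never meanings.
corpus = [
    "the cat sees the dog",
    "the dog sees the cat",
    "the bird eats the worm",
    "the worm sees the bird",
]

# Build a context profile for each word: the symbols that appear
# immediately to its left and right.
contexts = defaultdict(set)
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        left = words[i - 1] if i > 0 else "<s>"
        right = words[i + 1] if i < len(words) - 1 else "</s>"
        contexts[w].add(("L", left))
        contexts[w].add(("R", right))

def similarity(a, b):
    """Jaccard overlap between two words' context profiles."""
    return len(contexts[a] & contexts[b]) / len(contexts[a] | contexts[b])

# Greedily group words with overlapping profiles. Formal classes
# emerge (call them X and Y); nothing here knows what a noun *is*.
classes = []
for w in (w for w in contexts if w != "the"):
    for cls in classes:
        if similarity(w, cls[0]) > 0.15:
            cls.append(w)
            break
    else:
        classes.append([w])

print(classes)
# → [['cat', 'dog', 'bird', 'worm'], ['sees', 'eats']]
```

The procedure recovers the noun/verb split from distribution alone, which is exactly the point: classifying symbols into sets X and Y requires no knowledge of what those classes mean.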
> GPT-4 can learn only the patterns and relationships between and among
> word-symbols with no knowledge of the meanings of the individual words, *exactly
> as it reports that it does*. It does this extremely well, and it is in
> this way that it can *simulate* human understanding.