[ExI] GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Fri Apr 7 17:24:04 UTC 2023


On Fri, Apr 7, 2023 at 6:49 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> The use of square quotes then, is merely an escape to say it is acting as
> if it understands without understanding, or it is acting like it knows
> without knowing.

Yes, but calling it an "escape" is your framing; I mean it only as shorthand.
Instead of saying...

"GPT is an ingeniously designed software application programmed by its
developers to respond to prompts by predicting the next word, and the next
and the next, based on the statistics of how humans use words. It performs
this function exceedingly well, so much so that people are fooled into
thinking it actually knows the meanings of the sentences and paragraphs
that it generates."

I can write simply that GPT "knows" how to write sentences and paragraphs
that humans find meaningful, trusting the reader not to take the word
"knows" literally.

-gts


More information about the extropy-chat mailing list