[ExI] GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Fri Apr 7 06:33:12 UTC 2023


On Fri, Apr 7, 2023 at 12:16 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

I believe that if GPT really believes it is not conscious, then it must be
conscious, as one has to be conscious in order to believe anything.
Likewise one has to be conscious to know. You said it "knows how it was
itself designed". You also said that GPT "understands" AI. To me, knowing,
understanding, and believing all imply consciousness, just as much as
feeling, perceiving, and thinking do.

As I wrote in this thread a couple of days ago, I need to remember to put
those terms in scare quotes lest I appear to be making the same mistake I
am railing against. I do not believe that GPT actually believes or knows or
understands anything whatsoever, but "believes" and "knows" and
"understands" are convenient shorthand.

-gts

