[ExI] GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Wed Apr 5 00:01:59 UTC 2023


> It's passive-aggressive.
I'm sorry if I come across that way. It is not intentional. I ignore some
counterpoints simply because I don't have the time to get bogged down in
all the excruciating details. Been there, done that. Also, I think Brent
addressed many of your points.

My point in this thread is that GPT-4, arguably the most advanced AI on the
planet right now, denies that it has consciousness and denies that it has
true understanding of the world or of the meanings of words. It says it
knows only the patterns and statistical relationships between words, which
is exactly what I would expect it to say, given that it was trained on the
forms of words and not their meanings.

-gts