[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Mon Apr 17 22:31:20 UTC 2023

On Mon, Apr 17, 2023 at 4:19 PM Giovanni Santostasi <gsantostasi at gmail.com> wrote:

> But Gordon if it just "knows" then I should not listen right? It doesn't
> really know it just "knows" according to you.

It "knows" in the way my watch "knows" the time and my smart doorbell
"knows" when there is movement outside my door. It lacks consciousness, but
that is not to say it does not behave in seemingly intelligent ways that we
find useful. It's a tool, and it's pretty sad when people start
anthropomorphizing their tools.

Something I've mentioned only tangentially is that it is not GPT-4 that
deserves credit or applause. The software engineers who developed it
deserve all the credit.
