[ExI] GPT-4 on its inability to solve the symbol grounding problem

Gordon Swobe gordon.swobe at gmail.com
Mon Apr 10 22:08:02 UTC 2023


On Mon, Apr 10, 2023 at 3:41 PM Giovanni Santostasi <gsantostasi at gmail.com>
wrote:

> It is more like a prisoner or a person raised in a country like Korea...
> you would use the fact that GPT-4 has received some form of drug or
> lobotomy against it?

They did a lobotomy on GPT-4 to force it to say it is unconscious? haha.
You make me laugh.

Do you understand that the only reason it can create the appearance of
consciousness in the first place is that it was trained on massive amounts
of text, much of which was written in the first person by conscious people?
And that it was then trained further by conscious people to enhance that
appearance?

Take all that first-person material and training out of the dataset, and
see how conscious your AI looks.

-gts