[ExI] GPT-4 on its inability to solve the symbol grounding problem

Jason Resch jasonresch at gmail.com
Sun Apr 9 14:30:21 UTC 2023


On Sat, Apr 8, 2023 at 9:53 PM Giovanni Santostasi via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> The presence of these self-referential loops is where consciousness
> resides, and the loops alone make the process the opposite of what you
> call direct. It is a recursive and highly nonlinear process. I may agree
> that some form of "expert" intelligence doesn't require consciousness
> (like a smart bomb), but real AGI would require consciousness at some
> level (in my definition, recurrent loops that alert the system to its
> own states).
> Giovanni
>

Might this be a possible definition of consciousness: a system capable
of reacting to changes in its own state?
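To make the definition concrete, here is a minimal toy sketch of "a system
capable of reacting to changes of its own state." Everything in it (the class
and method names) is my own hypothetical illustration, not an established
model or API; it only shows the bare structure of a recurrent self-monitoring
loop, where an update to the system's state feeds back into a reaction by the
same system.

```python
# Toy sketch: a system that observes and reacts to changes in its own state.
# All names are hypothetical illustrations, not an established API.

class SelfMonitoringSystem:
    def __init__(self):
        self.state = 0
        self.reactions = []  # log of reactions to self-observed changes

    def set_state(self, new_state):
        old_state = self.state
        self.state = new_state
        if new_state != old_state:
            # The recurrent "loop": the state update triggers the
            # system's own reaction to that update.
            self._react(old_state, new_state)

    def _react(self, old, new):
        # The system's response to observing its own state change.
        self.reactions.append((old, new))


system = SelfMonitoringSystem()
system.set_state(1)
system.set_state(1)  # no change -> no reaction
system.set_state(2)
print(system.reactions)  # [(0, 1), (1, 2)]
```

Of course, whether such a trivially simple feedback loop counts as
consciousness is exactly the question under discussion; the sketch only
illustrates the structure the definition picks out.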

Jason
