[ExI] GPT-4 on its inability to solve the symbol grounding problem

Jason Resch jasonresch at gmail.com
Sat Apr 8 20:55:38 UTC 2023

On Sat, Apr 8, 2023, 3:06 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:

> On Sat, Apr 8, 2023 at 12:44 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> It has awareness in my opinion, as it has demonstrated a perception of
>> the words I feed it by virtue of crafting a sensible reply in response.
>> You can deny that awareness only if you find it consistent to speak of an
>> "unaware awareness."
> Everything can be said to be aware of something. I mentioned that the
> doorbell in my smart home becomes aware of motion outside my door and
> starts shooting video. Just as the words that GPT-4 generates are helpful
> and meaningful to me, so too is the fact that my doorbell is helping me
> look out for thieves and vandals, but it is trivial and uninteresting.

Do we agree then that your doorbell system has some trivial degree of
consciousness?
If yes then that is progress. Perhaps you just have a very different
definition of consciousness than I have. How do you define it?

