[ExI] GPT-4 on its inability to solve the symbol grounding problem

Brent Allsop brent.allsop at gmail.com
Sat Apr 8 03:51:36 UTC 2023


On Fri, Apr 7, 2023 at 9:46 PM Gordon Swobe via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Fri, Apr 7, 2023 at 5:38 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Note: I did not say the software is alive, nor did I say it was
>> conscious in the way humans are. Only that things that demonstrate
>> awareness of something can be assumed to be conscious of something.
>>
> How, then, do you define "conscious"? Certainly you don't think there is
> anything phenomenal in there, like redness and greenness, like our
> phenomenal consciousness that is like something?

> This makes the conscious LLM claim trivial and uninteresting. My Ring
> doorbell in my smart home is “aware” of motion in front of my door and
> starts shooting video. Excuse me, but so what.

Abstractly aware, yes. Now if it were phenomenally aware of motion, and
could experience colorness qualities representing what it is detecting,
THAT would be defined as phenomenally conscious.
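
To make the abstract-awareness point concrete: the doorbell's kind of
"awareness" is a few lines of code. A minimal sketch in Python follows;
the sensor reading and threshold are hypothetical, purely for
illustration, not any real Ring API.

# A minimal sketch of abstract, functional "awareness": a detector that
# responds to motion with nothing it is like to be the detector.

class MotionDoorbell:
    def __init__(self, threshold: float = 0.5):
        self.threshold = threshold  # hypothetical sensitivity setting
        self.recording = False

    def sense(self, motion_level: float) -> None:
        # "Aware" of motion only in the sense of a state change:
        # record whenever the reading exceeds the threshold.
        self.recording = motion_level > self.threshold


bell = MotionDoorbell()
bell.sense(0.9)
print(bell.recording)  # True: detection, with no redness or greenness

Everything the doorbell "knows" is exhausted by that boolean; the
disagreement is over whether anything more than this is going on in an
LLM.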