[ExI] GPT-4 on its inability to solve the symbol grounding problem
Gordon Swobe
gordon.swobe at gmail.com
Sat Apr 8 16:25:14 UTC 2023
On Sat, Apr 8, 2023 at 9:31 AM Jason Resch <jasonresch at gmail.com> wrote:
>
>
> On Sat, Apr 8, 2023, 10:45 AM Gordon Swobe <gordon.swobe at gmail.com> wrote:
>
>>
>> On Sat, Apr 8, 2023 at 3:43 AM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>
>>> There is phenomenal consciousness. That I would call awareness of first
>>> person non-sharable information concerning one's internal states of mind.
>>>
>>
>> It is this phenomenal consciousness to which I refer. If you do not think
>> there is something it is like to be a large language model, then we have no
>> disagreement.
>>
>
> I believe there is something it is like to be, either for the LLM or for
> something inside it.
>
I am not sure what you mean by "something inside it." The philosopher Thomas
Nagel wrote a famous paper titled "What Is It Like to Be a Bat?" That is the
sense I mean here. Do you think there is something it is like to be GPT-4?
When you ask it a question and it replies, is it aware of its own private
first-person experience in the sense that we are aware of ours? Or does it
have no awareness of any supposed experience?
-gts
>