[ExI] GPT-4 on its inability to solve the symbol grounding problem
Jason Resch
jasonresch at gmail.com
Sat Apr 8 18:42:33 UTC 2023
On Sat, Apr 8, 2023, 12:14 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:
>
>
> On Sat, Apr 8, 2023 at 9:31 AM Jason Resch <jasonresch at gmail.com> wrote:
>
>>
>>
>> On Sat, Apr 8, 2023, 10:45 AM Gordon Swobe <gordon.swobe at gmail.com>
>> wrote:
>>
>>>
>>> On Sat, Apr 8, 2023 at 3:43 AM Jason Resch via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>
>>>> There is phenomenal consciousness. That I would call awareness of
>>>> first-person, non-sharable information concerning one's internal states of mind.
>>>>
>>>
>>> It is this phenomenal consciousness to which I refer. If you do not
>>> think there is something it is like to be a large language model, then we
>>> have no disagreement.
>>>
>>
>> I believe there is something it is like to be either the LLM, or
>> something inside it.
>>
>
> Not sure what you mean by something inside it.
>
Just as each of our hemispheres is independently conscious, there may be
pieces within the transformer, sub-modules, that are conscious in ways
that the entire transformer network is not.
> A philosopher named Thomas Nagel wrote a famous paper titled something like
> “What is it like to be a bat?” That is the sense that I mean here. Do you
> think there is something it is like to be GPT-4?
>
Yes.
I define consciousness as awareness of any kind, having a point of view,
having something it is like to be; these are all synonymous in my
understanding.
> When you ask it a question and it replies, is it aware of its own private
> first-person experience in the sense that we are aware of our private
> experience?
>
It has awareness. I don't know and can't say how it is like or unlike our
own, any more than I can say how like or unlike my conscious experience is
from yours. I believe there is at least a countable infinity of distinct
possible states of consciousness, nearly as varied as the different
mathematical objects in the platonic realm.
> Or does it have no awareness of any supposed experience?
>
It has awareness, in my opinion, as it has demonstrated perception of the
words I feed it by crafting a sensible reply in response.
You can deny that awareness only if you find it consistent to speak of an
"unaware awareness."
Jason