[ExI] GPT-4 on its inability to solve the symbol grounding problem
Brent Allsop
brent.allsop at gmail.com
Sat Apr 8 21:49:49 UTC 2023
Hi Darin,
Your saying this shows you don't yet understand what the image is trying to
convey about all this.
Consciousness isn't a 'Hard Problem'; it is just a color problem.
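To make this concrete, here is a minimal Python sketch of the image's three
functionally equal machines. The class names, thresholds, and structure are
illustrative assumptions, not part of the image itself: all three answer
"red" when shown 700 nm light, yet one represents it with a redness quality,
one with a greenness quality, and one with only the abstract word 'Red'.

# A minimal sketch, assuming the three machines can be modeled as systems
# with identical input/output behavior but different internal
# representations. All names and thresholds here are hypothetical.

class RednessMachine:
    """Represents 700 nm light with an intrinsic 'redness' quality."""
    def detect(self, wavelength_nm: float) -> str:
        quality = "redness" if 650 <= wavelength_nm <= 750 else "other"
        return "red" if quality == "redness" else "not red"

class GreennessMachine:
    """Represents 700 nm light with a 'greenness' quality (inverted
    qualia), yet reports the same word because its dictionary differs."""
    def detect(self, wavelength_nm: float) -> str:
        quality = "greenness" if 650 <= wavelength_nm <= 750 else "other"
        return "red" if quality == "greenness" else "not red"

class AbstractMachine:
    """Represents 700 nm light with only the abstract token 'Red' -- no
    quality at all, just a symbol grounded by a lookup rule."""
    def detect(self, wavelength_nm: float) -> str:
        return "red" if 650 <= wavelength_nm <= 750 else "not red"

# All three are functionally equal: identical behavior, different (or
# absent) internal qualities, so behavior alone cannot tell them apart.
for machine in (RednessMachine(), GreennessMachine(), AbstractMachine()):
    assert machine.detect(700) == "red"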
On Sat, Apr 8, 2023 at 3:41 PM Darin Sunley via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> I would go so far as to say, following Chalmers, that the phenomenon of
> qualia is one of the most important and least understood and appreciated
> deep hints as to the fundamental nature of the universe. It is all
> pervasively experienced - indeed we experience literally nothing else - and
> yet it cannot be localized in spacetime and appears utterly causally
> disconnected from any measurable physical phenomena. The entire phenomenon
> bears a not-so-passing resemblance to ancient descriptions of the soul and
> related theological-anthropological metaphysical constructs.
>
> Even what little apparent progress has been made - the reconstruction of
> visual imagery from probes in mammalian visual cortices - is illusory. Even
> if the image in our visual field is found literally written in dendrite
> links and neurotransmitter densities, it brings us no closer to
> understanding the nature of a piece of paper that /experiences/ the
> picture written upon it.
>
> On Sat, Apr 8, 2023 at 3:29 PM Darin Sunley <dsunley at gmail.com> wrote:
>
>> A bit late to the party, but I'll take my swing at it:
>>
>> The phenomenal conscious experience of redness is a thing our brain does,
>> not a thing 700 nm light does.
>>
>> Not only this, but there is no actual causal link between any specific
>> phenomenal conscious experience that we have been taught to label
>> "redness", and photons of 700 nm light. Different neural architectures can,
>> and very well may, generate different phenomenal conscious experiences
>> (qualia) in response to 700 nm light, and many neural architectures, while
>> capable of detecting 700 nm light striking their visual sensors, may
>> generate no phenomenal conscious experience in response thereto at all.
>>
>> The questions of what a phenomenal conscious experience is, what generates
>> it, how it is generated in response to photons of a specific energy
>> striking a sensor, and what causes it to be one thing and not something
>> else, all fall under the umbrella of Chalmers' "hard problem" of
>> consciousness.
>>
>> The first hard thing about the hard problem of consciousness is
>> convincing some people that it exists. Or as someone (it may have been
>> Yudkowsky or Scott Alexander) pointed out, p-zombies are indistinguishable
>> from normal humans, /except/ in the specific case where they happen to be
>> philosophers writing about phenomenal conscious experience and qualia. :)
>>
>> On Sat, Apr 8, 2023 at 11:51 AM Brent Allsop via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>> I keep showing this image, attempting to communicate something:
>>>
>>> [image: 3_functionally_equal_machines_tiny.png]
>>> Sure, our elementary school teacher told us the one on the left is red,
>>> the one in the middle is green, and the one on the right is just the word
>>> 'Red'.
>>>
>>> But it is evident from all these conversations that nobody here
>>> understands the deeper meaning I'm attempting to communicate.
>>> Some people seem to be getting close, which is nice, but they may not
>>> yet be fully there.
>>> If everyone fully understood this, all these conversations would be
>>> radically different.
>>> Even if you disagree with me, can anyone describe the deeper meaning I'm
>>> attempting to communicate with this image?
>>> What does this image say about qualities, different ways of representing
>>> information, and different ways of doing computation?
>>>
>>> How about this: I'll give $100 worth of Ether, or just USD, to anyone
>>> who can fully describe the meaning this image is attempting to portray.
>>>
>>> On Sat, Apr 8, 2023 at 10:27 AM Gordon Swobe via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>>
>>>> On Sat, Apr 8, 2023 at 9:31 AM Jason Resch <jasonresch at gmail.com>
>>>> wrote:
>>>>
>>>>>
>>>>>
>>>>> On Sat, Apr 8, 2023, 10:45 AM Gordon Swobe <gordon.swobe at gmail.com>
>>>>> wrote:
>>>>>
>>>>>>
>>>>>> On Sat, Apr 8, 2023 at 3:43 AM Jason Resch via extropy-chat <
>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>
>>>>>>
>>>>>>> There is phenomenal consciousness. That I would call awareness of
>>>>>>> first-person, non-sharable information concerning one's internal states of
>>>>>>> mind.
>>>>>>>
>>>>>>
>>>>>> It is this phenomenal consciousness to which I refer. If you do not
>>>>>> think there is something it is like to be a large language model, then
>>>>>> we have no disagreement.
>>>>>>
>>>>>
>>>>> I believe there is something it is like to be either the LLM, or
>>>>> something inside it.
>>>>>
>>>>
>>>> Not sure what you mean by something inside it. A philosopher named
>>>> Thomas Nagel wrote a famous paper titled “What Is It Like to Be a
>>>> Bat?” That is the sense that I mean here. Do you think there is something
>>>> it is like to be GPT-4? When you ask it a question and it replies, is it
>>>> aware of its own private first person experience in the sense that we are
>>>> aware of our private experience? Or does it have no awareness of any
>>>> supposed experience?
>>>>
>>>> -gts
>>>>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 3_functionally_equal_machines_tiny.png
Type: image/png
Size: 26214 bytes
Desc: not available
URL: <http://lists.extropy.org/pipermail/extropy-chat/attachments/20230408/4e1b5718/attachment.png>