[ExI] GPT-4 on its inability to solve the symbol grounding problem
Will Steinberg
steinberg.will at gmail.com
Sun Apr 9 00:42:41 UTC 2023
Brent, do you think perhaps people might understand the info and images in
your camp from your website, which you have posted probably hundreds of
times, and they just disagree with you?
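
For what it's worth, here is my own reading of your image below, sketched
as code so there's no ambiguity about what I think you mean. This is a
rough Python sketch of my interpretation, not your formulation, and all
the names in it are mine. The point, as I read it: three systems can be
functionally identical from the outside while representing "red" with
entirely different internal states, one of which is a bare symbol that
means anything only via a dictionary.

# Three "machines" that answer identically, with different internal
# representations of red. (Illustrative names are mine, not Brent's.)

class RedQualityBrain:
    """Represents red light with a redness quality (the left machine)."""
    def perceive(self, wavelength_nm):
        self.state = "redness quale" if 620 <= wavelength_nm <= 750 else "other"
        return "red" if self.state == "redness quale" else "not red"

class GreenQualityBrain:
    """Represents red light with a greenness quality (the middle,
    red/green-inverted machine)."""
    def perceive(self, wavelength_nm):
        self.state = "greenness quale" if 620 <= wavelength_nm <= 750 else "other"
        # The outward report is identical, despite the substituted quality.
        return "red" if self.state == "greenness quale" else "not red"

class AbstractWordMachine:
    """Represents red light with the bare word 'Red' (the right machine).
    No quality at all: just a symbol, meaningful only via a dictionary."""
    DICTIONARY = {"Red": (620, 750)}
    def perceive(self, wavelength_nm):
        lo, hi = self.DICTIONARY["Red"]
        self.state = "Red" if lo <= wavelength_nm <= hi else None
        return "red" if self.state == "Red" else "not red"

# All three give the same answer to the same stimulus:
for machine in (RedQualityBrain(), GreenQualityBrain(), AbstractWordMachine()):
    print(type(machine).__name__, "->", machine.perceive(650))  # all "red"

If that's the point -- that identical function underdetermines the
internal representation, so the quality is a further fact over and above
the function -- then I think people here do understand it; they just
don't agree about what follows from it.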
On Sat, Apr 8, 2023, 1:50 PM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
>
> I keep showing this image, attempting to communicate something:
>
> [image: 3_functionally_equal_machines_tiny.png]
> Sure, our elementary school teacher told us the one on the left is red,
> the one in the middle is green, and the one on the right is just the word
> 'Red'.
>
> But it is evident from all these conversations that nobody here
> understands the deeper meaning I'm attempting to communicate.
> Some people seem to be getting close, which is nice, but they may not yet
> be fully there.
> If everyone fully understood this, all these conversations would be
> radically different.
> Even if you disagree with me, can anyone describe the deeper meaning I'm
> attempting to communicate with this image?
> What does this image say about qualities, different ways of representing
> information, and different ways of doing computation?
>
> How about this: I'll give $100 worth of Ether, or just USD, to anyone who
> can fully describe the meaning this image is attempting to portray.
>
> On Sat, Apr 8, 2023 at 10:27 AM Gordon Swobe via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>> On Sat, Apr 8, 2023 at 9:31 AM Jason Resch <jasonresch at gmail.com> wrote:
>>
>>>
>>>
>>> On Sat, Apr 8, 2023, 10:45 AM Gordon Swobe <gordon.swobe at gmail.com>
>>> wrote:
>>>
>>>>
>>>> On Sat, Apr 8, 2023 at 3:43 AM Jason Resch via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>
>>>>
>>>>> There is phenomenal consciousness. That is what I would call awareness
>>>>> of first-person, non-sharable information concerning one's internal
>>>>> states of mind.
>>>>>
>>>>
>>>> It is this phenomenal consciousness to which I refer. If you do not
>>>> think there is something it is like to be a large language model, then
>>>> we have no disagreement.
>>>>
>>>
>>> I believe there is something it is like to be either the LLM, or
>>> something inside it.
>>>
>>
>> Not sure what you mean by something inside it. The philosopher Thomas
>> Nagel wrote a famous paper titled “What Is It Like to Be a Bat?” That is
>> the sense I mean here. Do you think there is something it is like to be
>> GPT-4? When you ask it a question and it replies, is it aware of its own
>> private first-person experience in the sense that we are aware of our
>> private experience? Or does it have no awareness of any supposed
>> experience?
>>
>> -gts
>>