[ExI] GPT-4 on its inability to solve the symbol grounding problem
Giovanni Santostasi
gsantostasi at gmail.com
Sat Apr 15 22:33:13 UTC 2023
Brent,
I think you are fixated on the idea that redness is the experience of
redness, and that this experience locks in the secret of consciousness. I
would say it is almost the opposite. Redness is something the brain came
up with to experience red, because red was useful to detect, given that it
is associated with things like ripe fruit (a simplification, but a useful
one for my argument).
The system needs to be aware of this detection, so it found a way to alert
itself to the presence of red in the external environment.
But this is not the essence of consciousness, and it really doesn't matter
how it is achieved. Somehow, being aware of things in the environment gave
rise to this complex phenomenon of awareness. It is an emergent behavior,
and you would not be able to explain it in terms of "atomic" things like
redness. It is the transcendence of the atomic things, the fact that the
sum of the atomic things cannot reproduce the whole, that makes
consciousness so interesting and difficult to understand. It is an emergent
phenomenon.
So even the idea of recreating somebody's redness by focusing on the
"atomic" aspect of the experience (which is not atomic at all) completely
misses the point. Redness is not as simple as you claim it to be (there
are no pixels), and it is not going to help us understand what awareness
is.
Giovanni
On Sat, Apr 15, 2023 at 4:33 AM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
>
> Hi Ben,
>
> "Association" will work, but you're missing the point and talking about
> the wrong thing.
> If two people (or one person, at two different points in time) are
> associating the word Smaug with different dragons, we are asking the
> question: what is the difference between the two dragons that the two
> different people are "associating" the word Smaug with?
> I prefer "transducing dictionary" over "grounding" or "association", but
> everyone here was using "grounding", so I switched to that. You have
> one physical representation (a hole in a paper tape) that isn't redness;
> the transducing system interprets it into a different physical
> representation (+5 volts), and so on. You achieve consciousness when you
> transduce that +5 volts and render a pixel into someone's conscious
> knowledge that has a subjective redness quality.
>
>
> On Sat, Apr 15, 2023 at 2:16 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> I have a suggestion.
>>
>> Instead of 'ground', try using the word 'associate'. That seems more
>> useful to me. 'Grounding' implies that there is a single basis for the
>> meaning of whatever is being 'grounded'. But we know that this can't be
>> the case, e.g. my example of Smaug. Different people will create
>> different associations for the word, depending on their prior knowledge
>> of dragons, the story it appears in, images of dragons, or a specific
>> image of this particular dragon, and loads of other associations. You
>> can't say that 'Smaug' is 'grounded' in any single thing, even for one
>> individual, never mind many, so the term doesn't do justice to
>> what is actually happening. I think it actually obscures what's
>> happening, misleading us into assuming that a word can only be
>> associated with one experience (or one 'real-world thing', if you prefer).
>>
>> The same is true for things that actually do exist, like apples. There
>> are many, many apples, all different, and many, many experiences people
>> have associated with them. The word 'apple' cannot possibly be based on
>> one single thing; it's an abstraction built from many associations.
>> Using the word 'grounded' obscures this fact.
>>
>> Now I'm waiting for someone to say "but 'associating' is not the same
>> thing as 'grounding'!". If I'm right, and 'someone' does indeed object,
>> I'd be interested in their justification for this, seeing as
>> associations are all we have to work with in any information-processing
>> system, including the brain.
>>
>> On the other hand, if there is no objection, why don't we give it a try?
>> Drop the word 'grounding' altogether and use 'associating' instead.
>>
>> For starters, the "symbol grounding problem" becomes "the symbol
>> association problem".
>> Suddenly, it doesn't seem so much of a problem, does it?
>>
>> Ben
>>
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>