[ExI] GPT-4 on its inability to solve the symbol grounding problem
jasonresch at gmail.com
Mon Apr 10 22:09:36 UTC 2023
On Mon, Apr 10, 2023, 5:51 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:
> On Mon, Apr 10, 2023 at 3:26 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> On Mon, Apr 10, 2023 at 3:24 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:
>>> On Mon, Apr 10, 2023 at 1:53 PM Jason Resch via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>> What is the simplest possible conscious state that you can imagine?
>>>> What are its contents?
>>> It might be, for example, a brief sensation and awareness of pain. Let
>>> us say the pain of a toothache. I am an entirely unconscious being having
>>> no subjective first person experience whatsoever and no awareness of such,
>>> then for a moment, I become conscious and feel and note the subjective
>>> experience of a toothache, then fall back into unconsciousness.
>> Thank you for this answer.
>> In my view, pain is not simple, but rather a highly complex state,
>> involving many separate brain regions. Consider these passages on pain:
>> Paul Brand, a surgeon and author on the subject of pain, recounted the
>> case of a woman who had suffered severe and chronic pain for more
>> than a decade. She agreed to a surgery that would separate the neural
>> pathways between her frontal lobes and the rest of her brain. By all
>> accounts the surgery was a success. Brand visited the woman a year later
>> and inquired about her pain. She said, “Oh, yes, it’s still there. I just
>> don't worry about it anymore.” While smiling she added, “In fact, it's
>> still agonizing. But I don't mind.”
>> This shows that the sensation of pain can be perceived in a manner that
>> is separate from the unpleasantness of pain. As Minsky writes:
>> "As I see it, feelings are not strange alien things. It is precisely
>> those cognitive changes themselves that constitute what 'hurting' is––and
>> this also includes all those clumsy attempts to represent and summarize
>> those changes. The big mistake comes from looking for some single, simple,
>> 'essence' of hurting, rather than recognizing that this is the word we use
>> for complex rearrangement of our disposition of resources."
>> As we know, pains can be of various types, such as dull, sharp, burning,
>> aching, etc., and can also vary in intensity and location. A huge amount of
>> information is encoded in one's knowledge of pain, and to perceive it fully
>> requires the involvement and intercommunication of various disparate brain
>> regions.
>> Some examples of simpler states of human consciousness:
>> - A groggy person just waking, conscious only of the light shining in
>> their eyes
>> - A trained monk in quiet thoughtless solitude with eyes closed and a
>> mind empty of thoughts
>> - A person in a sensory deprivation tank who presses and feels the
>> light touch of one finger on the back of their opposite hand and focuses on
>> this feeling only
>> Then you might consider even simpler states of consciousness (assuming,
>> as you said, you believe other things besides humans are conscious):
>> - The consciousness of a mouse
>> - The consciousness of a slug
>> - The consciousness of a nematode
>> Do you think the above are conscious and that they are simpler than human
>> consciousness? Is it possible to go any simpler in your view?
> I understand pain (and other sensations/qualia) as irreducible and as
> having existence or ontology only in the first person. You go on about
> third person objective descriptions of pain, and how people might
> experience different kinds of pain, and theories about pain, and about the
> woman whose pain became tolerable or not painful at all after the surgery
> (but that is not pain!). Interesting, but only so much noise to me. My
> toothache is painful and unpleasant to me, and that is how pain is defined.
> You've probably felt the pain of a toothache and you know what the word
> means. Nothing could be simpler.
That was my point regarding the surgery. Pain is a composite of different
things: knowledge of pain, the discomfort of pain, the psychological
distress of pain, the grabbing and focusing of attention on the pain, the
desire to avoid the pain and the triggering of avoidance-seeking behavior,
etc. Yes, pain, as generally understood, is the combination of all these
things. But that means there are simpler states of consciousness: the type
of pain that woman felt, for example, knowledge of the pain without the
distress of the pain, is a simpler conscious state. In her case, we know
it's simpler, since the surgeons cut off one part of her brain and she
was still conscious
of something. So how much of the brain might we chip away without losing
that awareness? What is the smallest atom of consciousness that is possible?
If you ask me, I think the atom of consciousness is the if-then-else
construct: the simplest binary discrimination of some statement or input
that can put a system in more than one distinct state.
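The proposed "atom" above can be written down directly. Here is a minimal
sketch (my own illustration, not from the original post; the function name,
threshold, and state labels are all hypothetical) of a system whose entire
repertoire is a single binary discrimination that puts it into one of two
distinct internal states:

```python
def discriminate(stimulus: float, threshold: float = 0.5) -> str:
    """Make one binary discrimination on the input.

    This is the lone if-then-else: the system tests a single
    condition and lands in exactly one of two distinct states.
    """
    if stimulus > threshold:
        return "state_on"   # distinct state 1
    else:
        return "state_off"  # distinct state 2


print(discriminate(0.9))  # stimulus above threshold -> state_on
print(discriminate(0.1))  # stimulus below threshold -> state_off
```

Whether such a discriminator has any glimmer of experience is of course the
very point under dispute; the sketch only shows how little machinery the
proposed minimal unit requires.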
> Yes I can infer that a mouse probably also feels pain, but now I am
> beginning to tread outside of the first person and my thoughts start
> turning into conjectures.
We tread those waters when we suppose other humans are conscious. As I
asked before, how do you know you aren't the first person on Earth with a
gene mutation that makes you conscious? Our choice is then between
solipsism and conjecturing that other minds besides our own are conscious.