[ExI] GPT-4 on its inability to solve the symbol grounding problem
Jason Resch
jasonresch at gmail.com
Wed Apr 12 17:23:14 UTC 2023
On Wed, Apr 12, 2023 at 11:50 AM Gordon Swobe <gordon.swobe at gmail.com>
wrote:
> I mentioned Thomas Nagel and what is called the explanatory gap. You've
> probably heard of his paper "What is it like to be a bat?" I find myself
> largely in agreement with Nagel.
>
I am familiar with it.
>
> "But fundamentally an organism has conscious mental states if and only if
> there is something that it is like to *be* that organism -- something it is
> like *for* the organism.
>
I agree.
> We may call this the subjective character of experience. It is not
> captured by any of the familiar, recently devised reductive analyses of the
> mental, for all of them are logically compatible with its absence.
>
This I do not agree with. This is the thinking that leads one to believe
qualia are epiphenomenal and inessential, which in turn leads to zombies,
zombie twins, zombie earths, and so on.
The story of Mary the color scientist comes from a 1982 paper by Frank
Jackson called "Epiphenomenal Qualia." In it, he argued that qualia are
epiphenomenal, that is, they have no physical effects and are completely
physically unnecessary. You could remove them from the world and nothing
would change. Some years later, he realized that this position was absurd.
From:
https://philosophybites.com/2011/08/frank-jackson-on-what-mary-knew.html
FJ: “Epiphenomenalism was unbelievable, and indeed that was a consideration
that eventually made me change my mind.”
Interviewer: “So why did you change your mind?”
FJ: “Well, the biggest factor was the picture of myself writing
‘epiphenomenal qualia’, but not being caused to write ‘epiphenomenal
qualia’ by qualia. I said in ‘epiphenomenal qualia’ that you had to be an
epiphenomenalist about qualia, and what that meant was that qualia didn’t
change the words that came out of my mouth or the movements of my pen on
pieces of paper, so that meant that when I gave the talk defending
‘epiphenomenal qualia’, when I wrote the paper defending ‘epiphenomenal
qualia’, the qualia weren’t causing the talk and they weren’t causing the
writing, and I just decided this was sort of unbelievable. [...] It was the
picture of myself writing the paper, uncaused by the qualia. I said that I
can’t believe this. And I came to think that was the triumph of
philosophical cleverness over common sense.”
> It is not analyzable in terms of any explanatory system of functional
> states, or intentional states, since these could be ascribed to robots or
> automata that behaved like people though they experienced nothing.
>
The possibility of unconscious automata that behave like people (i.e.,
p-zombies) is stated here as an assumption; it is not argued for. The idea
of a functionally rich system able to act exactly like a human in all
circumstances and yet not be conscious may be as logically inconsistent as
an atomically identical copy of a healthy person who is nonetheless
unhealthy. To get a better sense of the preposterousness of zombies,
consider these passages:
“The zombie scenario posits that we can conceive of persons who behave
exactly as we do, but who lack inner experience. To pull off this trick, it
is necessary to invoke strategies to completely sequester consciousness
from anything that people say or do. The cost is that what ends up being
described is not what we usually think of as a person at all. Within a
passive-mentalist approach, a person is not an integrated whole of
phenomenal experience and behavior. Rather, they are effectively a zombie
carrying around a sealed box labeled “mental stuff.” And their physical
selves will never know what’s inside the box. Were they allowed to look
inside and become aware of the mental aspects of their existence, the
knowledge they gained would inevitably affect their behavior, which is
against the rules. The fact that passive mentalism admits the
conceivability of zombies implies that what it purports to explain is not
consciousness as we know it.”
-- Sean M. Carroll in "Consciousness and the Laws of Physics
<https://philpapers.org/archive/CARCAT-33>" (2021)
“Consciousness, whatever it may be—a substance, a process, a name for a
confusion—is not epiphenomenal; your mind can catch the inner listener in
the act of listening, and say so out loud. The fact that I have typed this
paragraph would at least seem to refute the idea that consciousness has no
experimentally detectable consequences.”
-- Eliezer Yudkowsky in "The Generalized Anti-Zombie Principle" (2008)
“Is the process of unconscious reflection, then, a path by which a zombie
could turn itself into a zimbo, and thereby render itself conscious? If it
is, then zombies must be conscious after all. All zombies are capable of
uttering convincing “speech acts” (remember, they’re indistinguishable from
our best friends), and this capability would be magical if the control
structures or processes causally responsible for it in the zombie’s brain
(or computer or whatever) were not reflective about the acts and their
(apparent, or functional) contents. A zombie might begin its career in an
uncommunicative and unreflective state, and hence truly be a zombie, an
unconscious being, but as soon as it began to “communicate” with others and
with itself, it would become equipped with the very sorts of states,
according to Rosenthal’s analysis, that suffice for consciousness.”
-- Daniel Dennett in "Consciousness Explained" (1991)
“There are plenty of objections in the literature to the conceivability of
zombies. But the idea is so alluring that those who think zombies are
conceivable tend to feel there must be something wrong with the objections;
the zombie idea may be problematic (they say) but surely it is not actually
incoherent. I will argue that, on the contrary, it is indeed incoherent,
involving a grossly distorted conception of phenomenal consciousness.
(A) The e-qualia story is not conceivable.
(B) If zombies were conceivable, the e-qualia story would be conceivable.
Therefore zombies are not conceivable."
-- Robert Kirk in "The inconceivability of zombies
<https://www.academia.edu/47822464/The_inconceivability_of_zombies>" (2008)
> It is not analyzable in terms of the causal role of experiences in
> relation to typical human behavior -- for similar reasons.
>
I am not sure how to interpret this.
> I do not deny that conscious mental states and events cause behavior, nor
> that they may be given functional characterizations.
>
That's good. I agree with this.
> I deny only that this kind of thing exhausts their analysis.
>
I agree in the sense that the subjective feeling cannot be communicated
purely in objective terms.
> Any reductionist program has to be based on an analysis of what is to be
> reduced. If the analysis leaves something out, the problem will be falsely
> posed."
>
I agree that reductionism will not offer a solution to the problems of
mind. Holism and emergentism seem more important for understanding the
vastly complex structures, patterns, and relations that our minds invoke.
>
> What Is It Like to Be a Bat?
>
> https://warwick.ac.uk/fac/cross_fac/iatl/study/ugmodules/humananimalstudies/lectures/32/nagel_bat.pdf
>
>
What do you think would happen to a person whose visual cortex were
replaced with a functionally equivalent silicon computer?
A) They wouldn't notice and there would be no change in their subjectivity
or objectively observable behavior
B) They would notice the change in their subjectivity (perhaps noticing a
kind of blindness) but they would function the same as before and not say
anything
C) They would notice the change and would complain about being blind, but
would still be able to function as if they could see
D) They would notice and become functionally blind, unable to drive or to
walk without bumping into things, etc.
E) Something else
Jason
> -gts
>
> On Mon, Apr 10, 2023 at 2:23 PM Gordon Swobe <gordon.swobe at gmail.com>
> wrote:
>
>> On Mon, Apr 10, 2023 at 1:53 PM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>> What is the simplest possible conscious state that you can imagine? What
>>> are its contents?
>>>
>>
>> It might be, for example, a brief sensation and awareness of pain. Let us
>> say the pain of a toothache. I am an entirely unconscious being having no
>> subjective first person experience whatsoever and no awareness of such;
>> then, for a moment, I become conscious and feel and note the subjective
>> experience of a toothache, then fall back into unconsciousness.
>>
>>
>> -gts
>>
>>
>