[ExI] Zombies are logically inconsistent: a proof

Jason Resch jasonresch at gmail.com
Tue May 16 21:33:22 UTC 2023


On Tue, May 16, 2023, 4:45 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Tue, May 16, 2023 at 12:48 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Tue, May 16, 2023 at 11:56 AM Adrian Tymes via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Tue, May 16, 2023 at 9:33 AM Jason Resch via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> *If we accept #2*, that zombies cannot think, believe, or know, then
>>>> we end up with zombies that can solve complex problems without thinking,
>>>> and zombies that can win at Jeopardy despite not knowing anything. This
>>>> thinking without thinking, and knowing without knowing, appears to be an
>>>> equally inconsistent notion.
>>>>
>>>
>>> And yet this is exactly what LLMs are allegedly doing, with their
>>> internal models used as evidence for how they accomplish this.
>>>
>>> You're going to need a lot more than "appears to be" for a proof,
>>> because to those who believe #2, there is no appearance of
>>> inconsistency here.
>>>
>>
>> Note that the "appears" referred to the level of inconsistency in the
>> former argument, not the argument itself.
>>
>> I have trouble conceiving of anything more convincing than zombies
>> violating the law of noncontradiction, which is implied by both options:
>>
>> Def: Zombies are "¬conscious ^ behaviorally-equivalent"
>> Def: B is a behavior (e.g. believing, knowing, thinking, or having the
>> ability to write a book about consciousness) which implies consciousness
>>
>
> Here is where you run into trouble.  Those who say that LLMs are
> behaviorally equivalent to conscious people but are not themselves
> conscious, define that no specific behavior implies consciousness.  To say
> otherwise would lead to the logical contradiction you note.
>

I think the "no B exists" assumption ("No specific behavior, nor any
aggregate set of behaviors, implies the presence of a conscious mind") also
leads to a contradiction.

Corollary 1. Talking about one's innermost desires, thoughts, feelings,
sensations, emotions, and beliefs does not require consciousness.

Corollary 2. One could claim to be conscious and be wrong, for reasons that
neither they nor any other person could ever prove or even know. That is,
there would be truths that stand outside both objective and subjective
reality.

Corollary 3. The information indicating that one person is a zombie while
another is not would have to stand outside the physical universe; but where,
then, is this information held?

Corollary 4. For there to be no behavior indicative of consciousness implies
that there is no possible behavioral difference between being conscious and
not being conscious. The information concerning one's conscious mental
states must therefore stand outside the physical chain of causality in one's
mind in order to guarantee behavioral equivalence.

It follows from Corollary 4 that we, as physical beings, can have no more
access to this information than any zombie does; i.e., we have no more (or
less) access to our own mental states than a zombie does.

One's consciousness, defined as the ability to access and report information
concerning one's internal mental states, must then be equivalent to that of
a zombie. Thus one is no more, and no less, conscious than one's own zombie
twin.

Corollary 4 follows from our assumption that there is no behavior B whose
presence indicates consciousness. But it shows that consciousness must then
be equivalent between a conscious person and their behaviorally identical
zombie twin. Zombies, however, were defined not to be conscious. This is a
contradiction if we presume ourselves to be more conscious than our zombie
twins.
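
For concreteness, here is a minimal sketch of that contradiction as a Lean 4
proof (core only, no libraries). It is just a propositional rendering under
the assumptions stated above: consciousness is identified with the capacity
to access and report one's mental states, and behavioral equivalence
preserves that capacity. The proposition names and premise labels are mine,
introduced only for illustration.

-- conscious_p / conscious_z : the person / the zombie twin is conscious
-- reports_p   / reports_z   : the person / the zombie twin can access and
--                             report its internal mental states
theorem zombie_contradiction
    (conscious_p conscious_z reports_p reports_z : Prop)
    -- Consciousness identified with the reporting capacity (the definition
    -- used in the paragraphs above):
    (def_p : conscious_p ↔ reports_p)
    (def_z : conscious_z ↔ reports_z)
    -- Behavioral equivalence: the zombie twin has the same reporting
    -- behavior as the person:
    (beq : reports_p ↔ reports_z)
    -- The person is conscious; the zombie twin, by definition, is not:
    (hp : conscious_p)
    (hz : ¬ conscious_z) : False :=
  hz (def_z.mpr (beq.mp (def_p.mp hp)))

The derivation only goes through because both def_z and beq are granted;
rejecting either one (a non-behavioral definition of consciousness, or a
zombie whose reporting behavior differs) is the escape route a zombie
proponent would have to take.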

Jason