[ExI] Zombies are logically inconsistent: a proof
Adrian Tymes
atymes at gmail.com
Tue May 16 20:43:54 UTC 2023
On Tue, May 16, 2023 at 12:48 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Tue, May 16, 2023 at 11:56 AM Adrian Tymes via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Tue, May 16, 2023 at 9:33 AM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> *If we accept #2*, that zombies cannot think, believe, or know, then we
>>> end up with zombies that can solve complex problems without thinking, and
>>> zombies that can win at Jeopardy despite not knowing anything. This
>>> thinking without thinking, and knowing without knowing, appears to be an
>>> equally inconsistent notion.
>>>
>>
>> And yet this is exactly what LLMs are allegedly doing, with their
>> internal models cited as evidence of how they accomplish it.
>>
>> You're going to need a lot more than "appears to be" for a proof, because
>> to those who believe #2, there is no appearance of inconsistency here.
>>
>
> Note that the "appears" referred to the level of inconsistency in the
> former argument, not the argument itself.
>
> I have trouble conceiving of anything more convincing than zombies
> violating the law of noncontradiction, which is implied by both options:
>
> Def: Zombies are "¬conscious ∧ behaviorally-equivalent"
> Def: B is a behavior (e.g. believing, knowing, thinking, or having the
> ability to write a book about consciousness) which implies consciousness
>
Here is where you run into trouble. Those who say that LLMs are
behaviorally equivalent to conscious people, yet not themselves
conscious, hold by definition that no specific behavior implies
consciousness. To say otherwise would commit them to the logical
contradiction you note.
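
To make the structure of the disagreement explicit, here is a minimal
Lean 4 sketch of the contradiction, and of the escape just described.
The predicate names (conscious, behaviorallyEquivalent, exhibitsB) and
the bridging assumption that behavioral equivalence entails exhibiting
B are illustrative glosses on the quoted definitions, not anything
either side of the thread has committed to:

-- Illustrative formalization only; every name below is a gloss on the
-- thread's definitions, not an established vocabulary.
axiom Agent : Type
axiom conscious : Agent → Prop
axiom behaviorallyEquivalent : Agent → Prop
axiom exhibitsB : Agent → Prop  -- "B" from the definition quoted above

-- Def: B is a behavior which implies consciousness.
axiom B_implies_conscious : ∀ a, exhibitsB a → conscious a

-- Bridging assumption (mine): behavioral equivalence entails
-- exhibiting B.
axiom equiv_implies_B : ∀ a, behaviorallyEquivalent a → exhibitsB a

-- Def: zombies are "¬conscious ∧ behaviorally-equivalent". Granting
-- everything above, no such thing can exist, on pain of violating the
-- law of noncontradiction.
theorem no_zombies (a : Agent) :
    ¬ (behaviorallyEquivalent a ∧ ¬ conscious a) :=
  fun hz => hz.2 (B_implies_conscious a (equiv_implies_B a hz.1))

The position I am describing simply declines to assert
B_implies_conscious for any behavior B; with that axiom gone,
no_zombies is no longer derivable, and the contradiction never arises.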