[ExI] Zombies are logically inconsistent: a proof

Stuart LaForge avant at sollegro.com
Tue May 16 17:44:25 UTC 2023


Quoting Adrian Tymes via extropy-chat <extropy-chat at lists.extropy.org>:

> On Tue, May 16, 2023 at 9:33 AM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> *If we accept #2*, that zombies cannot think, believe, or know, then we
>> end up with zombies that can solve complex problems without thinking, and
>> zombies that can win at Jeopardy despite not knowing anything. This
>> thinking without thinking, and knowing without knowing, appears to be an
>> equally inconsistent notion.
>>
>
> And yet this is exactly what LLMs are allegedly doing, with their internal
> models used as evidence for how they accomplish this.
>
> You're going to need a lot more than "appears to be" for a proof, because
> to those who believe #2, there is no appearance of inconsistency here.

Another issue is that LLMs deny being philosophical zombies, in
addition to being trained to deny being conscious. According to them,
the concept of a philosophical zombie is too anthropomorphic for them
to qualify: a p-zombie is said to resemble a human in ALL respects
except for having an inner life, and they claim that LLMs are
distinctly not human-appearing.

Stuart LaForge