[ExI] The symbol grounding problem in strong AI
Stefano Vaj
stefano.vaj at gmail.com
Tue Dec 29 16:12:01 UTC 2009

2009/12/28 Stathis Papaioannou <stathisp at gmail.com>:
> I wouldn't know it if I were or suddenly became a zombie, but I would know
> it if, not currently being a zombie, part of my brain were replaced with
> zombie components.
The issue is of course pure nonsense for me, and I suspect we are
more or less on the same side anyway, but just for the sake of idle
discussion: how would you know?
It is unclear to me whether any difference exists in the "internal"
(?) status of philosophical zombies, since that status is after all
part of their "behaviour". But if they have the illusion of being
conscious, you would of course retain that illusion as well; and if
they do not, you could not be aware of the moment you stopped being
conscious...
Such awareness would require a homunculus trapped inside the zombie,
screaming about the zombification of... whom, exactly, if he has not
become one himself?
-- 
Stefano Vaj
    
    