[ExI] Some new angle about AI

Stathis Papaioannou stathisp at gmail.com
Tue Jan 5 07:15:50 UTC 2010


2010/1/5 Lee Corbin <lcorbin at rawbw.com>:

> Yes, that's it. It is logically conceivable, after all, as
> several on this list maintain, that every time you replace
> any biologically operating part with a mechanism that, say,
> does not involve chemical transformations, then your
> experience is diminished proportionally, with the end
> result that any non-biological entity actually has none
> of this consciousness you refer to. While *logically*
> possible, of course, I consider this possibility very
> remote.

If your language centre were zombified, you would be able to
participate normally in a conversation and you would honestly believe
that you understood everything that was said to you, but in fact you
would understand nothing. It's possible that you have a zombified
language centre right now, a side-effect of the sandwich you had for
lunch yesterday. You wouldn't know it, and even if it were somehow
revealed to you, there wouldn't be any good reason to avoid those
sandwiches in future. Conversely, if you think that such a distinction
between true experience and zombie experience is incoherent, then
arguably it is not even logically possible for artificial neurons to
be functionally identical to normal neurons and yet lack whatever is
required for consciousness.


-- 
Stathis Papaioannou



More information about the extropy-chat mailing list