[ExI] Mind extension
Stathis Papaioannou
stathisp at gmail.com
Thu Feb 4 22:43:49 UTC 2010
On 5 February 2010 00:38, Ben Zaiboc <bbenzai at yahoo.com> wrote:
> I've been pondering this issue, and it's possible that there's a way around the problem of confirming that consciousness can run on artificial neurons without actually removing existing natural neurons and condemning the subject to death if it turns out to be untrue.
>
> I'm thinking of a 'mind extension' scenario, where you attach these artificial neurons (or their software equivalent) to an existing brain using neural interfaces, in a configuration that does something useful, like giving an extra sense or an expanded or secondary short-term memory (of course all this assumes good neural interface technology, working artificial neurons and a better understanding of mental architecture than we have just now). Let the user settle in with the new part of their brain for a while; then they should be able to tell whether they 'inhabit' it or whether it's just like driving a car: something 'out there' that they are operating.
>
> If they feel that their consciousness now partly resides in the new brain area, it should be possible to duplicate all the vital brain modules in the artificial substrate and selectively anaesthetise their biological counterparts without any change in subjective experience.
>
> If the person says "Hang on, I blanked out there" about the period when the artificial brain parts were operating on their own, we would know that those parts don't support conscious experience, and the person could say 'no thanks' to uploading, with their original brain intact.
>
> The overall idea is to build extra room for the mind to expand into, and see whether it really does or not. If the new, artificial parts actually don't support consciousness, you'd soon notice. If they do, you could augment your brain to the point where the original is just a tiny part, and you wouldn't even miss it when it eventually dies off.
An important point: if you noticed a difference, that would mean not
only that the artificial parts don't support normal consciousness,
but also that they fail to exactly reproduce the objectively
observable behaviour of the natural neurons. Noticing and reporting a
gap is itself behaviour, and behaviour is driven by neural outputs,
so a functionally perfect replacement could not produce such a
report.
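
A toy way to make that concrete (Python; everything below is a
made-up illustration, not a model of real neurons): treat each part
as a pure input-to-output function and the subject's verbal report as
a function of those outputs. If the replacement matches the original
on every input, the report cannot differ, so a report of blanking out
would imply the mappings differed.

    # Toy illustration (hypothetical model): if a replacement reproduces
    # the original's input->output mapping exactly, nothing downstream,
    # including the subject's verbal report, can come out differently.

    def biological_part(stimulus):
        # Stand-in for the natural neurons' observable behaviour.
        return (3 * stimulus + 1) % 7

    def artificial_part(stimulus):
        # A replacement matching the biological mapping on every input.
        return (3 * stimulus + 1) % 7

    def verbal_report(outputs):
        # On this toy model, what the subject says is a deterministic
        # function of the signals the rest of the brain receives.
        return "blanked out" if any(o > 6 for o in outputs) else "felt normal"

    stimuli = range(1000)
    bio = [biological_part(s) for s in stimuli]
    art = [artificial_part(s) for s in stimuli]

    assert bio == art
    # Identical outputs force an identical report; a differing report
    # ("I blanked out") would prove the mappings were not identical.
    assert verbal_report(bio) == verbal_report(art)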
--
Stathis Papaioannou