[ExI] Coherent vs. Incoherent Fears of Being Uploaded

Stathis Papaioannou stathisp at gmail.com
Mon Jan 18 11:11:25 UTC 2010

2010/1/18 Lee Corbin <lcorbin at rawbw.com>:
> The central question, of course, is whether one would
> *survive* the uploading process.
> (This emphasis on the concept of survivability I first
> saw in Parsons (1984), "Reasons and Persons", which was
> the very first up-to-speed account of identity in the
> English language, so far as I know.)

I assume you mean Parfit?

> We have discussed here what possible danger would
> attend the replacement of neurons by artificial ones,
> or to go further, say the replacement of an entire
> neural tract connecting, for example, the amygdala and
> the hippocampus. Whereas hitherto masses of neurons
> fired, we now have a relatively simple electric
> circuit, though it still has, of course, to implement
> the tens of millions of functional inputs and outputs
> realized by the former mass of neurons in the tract.

I'm not sure what a lesion in the place you describe would do. To make
the example easier, can we talk about the visual cortex?

> The wrong question, which has been repeatedly voiced,
> is "would I notice?". This is completely wrong because
> *noticing* is an active neural behavior which itself
> is composed of the firings of millions of neurons.
> Of course no "noticing" of that kind would occur,
> because under the hypothesis, the entire effective
> circuit between the hippocampus and the amygdala
> has been functionally and accurately replaced by an
> electric one.

You wouldn't notice if the replacement, however it was implemented,
functioned in exactly the same way as the original, including
producing the same consciousness. I think even someone who believed in
a soul would have to agree with that statement, although they might
say that only God could make such a functional replacement.

The real test is to postulate that it is possible to make a
replacement that is functionally identical except for the
consciousness component, and see where this leads. I think it leads to
absurdity, and my conclusion is that it is therefore *not* possible to
make functionally identical brain components (and by extension,
brains) without also reproducing the consciousness.
> Suppose you had a switch and a couple of movies
> to watch. When the switch is in position A your
> original neural tract operates, and when it's in
> B, the electric circuit acts instead. During the
> first movie you watch, you keep the switch in
> the A position, and then watch the second movie
> with it in the B position. It's just completely
> wrong to wonder whether or not you'd later be able
> to *report* (even to yourself) whether the first
> movie was somehow more vivid.
> To lose, even partially, that kind of subjective
> experience is an incoherent fear.


> Instead, the right question to ask is "Would I have
> *less* experience, even though being completely unable
> to report---even to myself---that this was the case?".
> This is a coherent fear: one does not wish to be
> zombiefied, not even a little, unless there were
> no medical alternative to curing some malady.

I don't see how this question is any different. If my entire visual
cortex were removed then I would have *no* visual experience, and I
would certainly notice, as would anyone who asked me about the movie.
If my entire visual cortex were zombified then again I would have *no*
visual experience, but I would report seeing normally, I would have
the same emotional reactions to the movie, I would be able to describe
it appropriately, and I would honestly believe that nothing had
changed. In both cases I am completely blind, but in the latter case I
don't notice it, and neither does anyone else.

Suppose you still think that it is coherent to speak of a distinction
between real vision and zombie vision. How do you know which sort of
vision you have right now? How do you know if the human visual cortex,
due to its complexity, evolved with only zombie vision? (Of course,
this fact would have to be revealed rather than discovered, since no
scientific test or subjective report would count for or against it).
And if you did have this defect and were offered an operation to
correct it, knowing that everyone else who has had this operation
behaves just the same as before and says that everything looks just
the same as before, would you have it?

> And---so this coherent (but I think quite wrong) view
> goes---the ultimate end of replacing all of your brain
> by electronic circuitry would be the complete loss
> of there being a subject (you) at all! Which is entirely
> equivalent to death. In the language of some, here,
> no more "qualia", and no more experience.

Your qualia would be replaced with zombie qualia, which are
indistinguishable from and just as good as normal qualia.

Note that this is different to having *no* qualia. A human with a
visual cortex lesion still responds to visual stimuli as evidenced by
the pupillary reflex, and in cases of so-called blindsight can
correctly describe objects shown to him while claiming that he sees
nothing. The opposite can happen in Anton's syndrome: blind patients
stumble around walking into things while maintaining the delusional
belief that they can see normally. Also, a true philosophical zombie
has no qualia at all, and no understanding that it has no qualia
(because it has no understanding of anything). If this zombie's visual
cortex were put in your head you would immediately say that you had
gone blind, and you would indeed have gone blind, because it could not
be functioning like normal brain tissue, sending normal outputs to the
rest of your brain. If it had been functioning this way, then it would
not be from an unconscious zombie, but from a normally conscious being
or a being with zombie consciousness indistinguishable from normal
consciousness.

> We come right back to the fundamental question: does
> the functional equivalent supply the subjectivity,
> i.e., supply the "qualia" of existence?
> To me it seems completely bizarre and extremely
> unlikely that somehow nature would have chosen to
> bestow a "beingness" or consciousness on one
> peculiar way for mammals to be successful:
> our way, with gooey neurons and neurotransmitters.
> And that had electronic or other means of
> accomplishing the same ends for exactly the
> same kinds of creature with high fitness been
> supplied by nature instead, then magically no
> consciousness, no qualia, and no subject.
> It sounds absurd even to write out such a claim.

It's not absurd, just very unlikely to be the case. But zombie
consciousness is absurd.

Stathis Papaioannou
