[ExI] Coherent vs. Incoherent Fears of Being Uploaded

Lee Corbin lcorbin at rawbw.com
Sun Jan 17 23:11:07 UTC 2010


The central question, of course, is whether one would
*survive* the uploading process.

(This emphasis on the concept of survivability I first
saw in Parfit (1984), "Reasons and Persons", which was
the very first up-to-speed account of identity in the
English language, so far as I know.)

We have discussed here what possible danger would
attend the replacement of neurons by artificial ones,
or, to go further, the replacement of an entire
neural tract connecting, for example, the amygdala and
the hippocampus. Whereas hitherto masses of neurons
fired, we now have a relatively simple electric
circuit (though it still has, of course, to implement
the tens of millions of functional inputs and outputs
realized by the former mass of neurons in the tract).

The wrong question, which has been repeatedly voiced,
is "would I notice?". This is completely wrong because
*noticing* is an active neural behavior which itself
is composed of the firings of millions of neurons.
Of course no "noticing" of that kind would occur,
because under the hypothesis, the entire effective
circuit between the hippocampus and the amygdala
has been functionally and accurately replaced by an
electric one.

Suppose you had a switch and a couple of movies
to watch. When the switch is in position A your
original neural tract operates, and when it's in
B, the electric circuit acts instead. During the
first movie you watch, you keep the switch in
the A position, and then watch the second movie
with it in the B position. It's just completely
wrong to wonder whether you'd later be able
to *report* (even to yourself) that the first
movie was somehow more vivid.

To fear losing, even partially, that kind of
reportable subjective experience is incoherent.

Instead, the right question to ask is "Would I have
*less* experience, even though I would be completely
unable to report---even to myself---that this was the
case?". This is a coherent fear: one does not wish to
be zombified, not even a little, unless there were
no medical alternative for curing some malady.

And---so this coherent (but I think quite wrong) view
goes---the ultimate end of replacing all of your brain
with electronic circuitry would be the complete loss
of there being a subject (you) at all! That would be
entirely equivalent to death. In the language of some
here: no more "qualia", and no more experience.

We come right back to the fundamental question: does
the functional equivalent supply the subjectivity,
i.e., supply the "qualia" of existence?

To me it seems completely bizarre and extremely
unlikely that nature would somehow have chosen to
bestow "beingness" or consciousness on only one
peculiar way for mammals to be successful:
our way, with gooey neurons and neurotransmitters.
And that, had nature instead supplied electronic
or other means of accomplishing the same ends for
exactly the same kinds of highly fit creatures,
then magically there would be no consciousness,
no qualia, and no subject.

It sounds absurd even to write out such a claim.

Lee


