[ExI] Uploads are self

Jason Resch jasonresch at gmail.com
Wed Mar 18 02:46:28 UTC 2026


On Tue, Mar 17, 2026 at 6:40 PM Ben Zaiboc via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> On 17/03/2026 17:19, Jason Resch wrote:
> >
> >     >     Ben wrote:
> >     >>     But the main thing that's required to actually believe (small
> 'b' version) this, is a materialistic mindset (as in, a complete rejection
> of dualism), and that's not very common so far.
> >     >
> >     >
> >     > I think that is part of it, but even many materialists hold that
> destruction of the body means death, and that any later instances are mere
> duplicates, who are not you.
> >
> >
> >
> >     Then they are what I usually call 'crypto-dualists', not
> materialists. Completely dispensing with dualism is difficult, I struggled
> with it for a long time, but once you do, you realise that 'mere
> duplicates' are, necessarily, actually you, in every way that matters.
> >
> >
> > I agree but what was the mental process of realization you went through
> to escape that? How would you argue with someone else that the specifics of
> a body doesn't matter? What thought experiments or reasons would you offer
> to show that the usual view doesn't hold?
> >
>
>
> It was reading 'Vast' by Linda Nagata that started me thinking more deeply
> about this.


For me it was also a piece of science fiction: the movie The Prestige.


> There's a character who is basically the mind of a spaceship who is
> conscious while the rest of the ship's complement are in suspension,
> because there's no magical faster-than-light travel, so the journeys take
> anywhere from decades to centuries, but somebody has to stay awake to react
> to any problems.
>
> The solution to keeping this person sane over long periods of time with
> not much to do except watch for things, is to 'reboot' them every 90
> seconds (I think it was 90 seconds, but a short time, anyway). In other
> words, if nothing of note happened, their memory was wiped clean and
> restarted, so they were always only conscious of the last 90 seconds at the
> most, unless something interesting was noticed. You might think of this as
> being 'killed' every 90 seconds then resurrected from a template.
>

I developed a similar thought experiment along these lines. Let's say that
tomorrow you were to be abducted by one of two alien species:

The first will conduct many ghastly experiments on you, many of which will
be painful. But not to worry: they will wipe your memory afterwards and
perfectly reset your body to how it was pre-experimentation.

The second will perform the same set of experiments, except they will
perform them on a perfect clone of you, and upon completion of their
experiments will painlessly terminate this clone.

Should you be given the option, is there any reason to prefer the first or
the second species of aliens? Note that all the same subjective states are
realized in both cases. One realizes them across time, while the other
realizes them across space. Personally, I find the two cases equivalent.


>
> It was trying to imagine myself in this position that eventually led me to
> realise that, in order to make sense of it, I had to lose any trace of
> dualistic thinking, and after a few false starts, I realised that it's the
> dualism that leads to all the 'problems' we have in thinking about these
> things. Fully accepting that our minds are patterns of information actually
> simplifies things, and it made me realise that saying things such as "a
> copy of me would not be me, but someone else" is total nonsense.
>

Yes, there would be no "difference that makes a difference" unless one
imputes one from the outside, and if everything is physically the same,
any such factor would have to be a non-physical difference.


>
> The language we use when talking about these things actually makes things
> harder to understand. E.g. 'copy' carries some connotations that don't
> apply when talking about information, so saying something like 'a copy of
> my mind' is a bad way of putting it.


Right. These things are more like "types" than "tokens": all instances of
the letter 'A' are the same letter, despite all the distinct places it may
occur. I like your information analogy, because it is a case where people
readily acknowledge there is no difference between one occurrence of a bit
string and another identical bit string: it is impossible to say which is
more real, or more original, than the other.


> Even just the phrase 'my mind' is wrong, and reinforces dualistic
> thinking, because if 'I' HAVE a mind, then I and my mind are separate
> things. So then what am I?
>

I think there is still utility in reserving the word "I" to refer to
that universal property common to all minds. In the same sense that every
place is (to itself) a *here*, and every time is (to itself) a *now*, every
mind is (to itself) an *I*.

Laying it out this way also dissolves the powerful impression that there is
something special or unique about any particular mind feeling like a
privileged I. All mind states are experienced as I.

This can lead to the open-individualist/universalist realization: if all
conscious moments are experienced as I (in a direct, immediate,
first-person way), then all experiences have everything they need to be
considered "mine." There is nothing else about an experience that makes it
"yours" aside from the fact that it feels as if it is experienced in this
direct, immediate, first-person way.


>
> Hopefully, you can see that this is a red herring. If our minds are
> information patterns, then 'I' don't HAVE a mind, I AM a mind. The mind
> that my brain is producing is actually what 'I' am. I am information. A
> complex, dynamic pattern of information. It necessarily follows that if
> that information is read and then instantiated somewhere else, so that the
> information processing goes on in the same way as in my original brain,
> then that is me. I am now somewhere else.
>
> The logical consequence of this is that if the same information is
> instantiated in more than one place, there is now more than one me. Weird,
> yes, but necessarily true. And if my original brain is destroyed, but the
> mind that it used to produce is running in a different processing
> substrate, I'm not dead, I'm in that different processing substrate. Not a
> 'copy', but the actual real me.
>

Yes.


>
> Once you realise that minds are information, the confusion goes away. As
> John K Clark has said, science tells us that there are only 3 things:
> Matter, Energy and Information. (I'd modify that, and say there are only 3
> things: Space-time, Matter-energy, and Information, but it doesn't really
> matter). Minds can only be one of these things. Once you fully accept that,
> dualism can be dispensed with, and things like uploading and branching
> identity easily make sense.
>
> > What things do you believe are necessary for one to survive? Would
> every synaptic weight have to be determined exactly, or is there some
> factor of "close enough" (say if it is as similar to how you were two weeks
> ago, that is sufficient)?
>
>
> The recent fruit-fly upload seems to suggest that individual synaptic
> weights are not actually necessary to record (which surprised me.
> Apparently it's the number of synaptic connections between neurons that's
> important. Maybe this won't be the same with human brains, but we'll see).
>
> My suspicion is that as long as you get the detailed connectome right
> (plus things like the type of neurons), this will establish 'attractor
> states' that are fairly tolerant to minor differences, so inaccuracies in
> things like connection strengths will not be so important, and maybe you
> would wake up feeling a bit strange, but that would soon fade as things
> settle down to their normal states. But that's just speculation, really. Or
> maybe wishful thinking, but I'd guess that uploading could actually turn
> out to be a lot easier than we think, given a certain level of technology
> (mainly for the scanning, I'm pretty confident that that will always be the
> hardest thing).
>

I think there is possibly one extra step you could take, one final dualism
to dispense with: the idea that you are defined by a particular, exact
information pattern. Certainly, throughout our lives, the information
pattern we identify with changes drastically. Are there any limits to how
much that conscious pattern could change before it ceases being an "I",
before it would stop feeling as though you are still there, vividly having
that experience? I think this, too, is a last vestige of dualism: defining
some "similarity function" which, when satisfied, means you live, and when
not satisfied, means you die. Note that in any case where your personality
or memories are altered, that perspective will still feel 100% certain that
it is alive and has survived (despite the loss of memories or the
personality change). So my challenge is to push back and say that
similarity (like bodily continuity) is another red herring, as far as
subjective survival is concerned and as far as defining "What am I?"

Jason
