[ExI] Uploads are self
Ben Zaiboc
benzaiboc at proton.me
Tue Mar 17 22:39:18 UTC 2026
On 17/03/2026 17:19, Jason Resch wrote:
>
> > Ben wrote:
> >> But the main thing that's required to actually believe (small 'b' version) this, is a materialistic mindset (as in, a complete rejection of dualism), and that's not very common so far.
> >
> >
> > I think that is part of it, but even many materialists hold that destruction of the body means death, and that any later instances are mere duplicates, who are not you.
>
>
>
> Then they are what I usually call 'crypto-dualists', not materialists. Completely dispensing with dualism is difficult, I struggled with it for a long time, but once you do, you realise that 'mere duplicates' are, necessarily, actually you, in every way that matters.
>
>
> I agree, but what was the mental process of realization you went through to escape that? How would you argue with someone else that the specifics of a body don't matter? What thought experiments or reasons would you offer to show that the usual view doesn't hold?
>
It was reading 'Vast' by Linda Nagata that started me thinking more deeply about this. There's a character who is essentially the mind of a spaceship, conscious while the rest of the ship's complement is in suspension. With no magical faster-than-light travel, the journeys take anywhere from decades to centuries, but somebody has to stay awake to react to any problems.
The solution for keeping this person sane over long stretches of time, with not much to do except watch for things, is to 'reboot' them every 90 seconds (I think it was 90 seconds; a short interval, anyway). In other words, if nothing of note happened, their memory was wiped clean and restarted, so they were only ever conscious of the last 90 seconds at most, unless something interesting was noticed. You might think of this as being 'killed' every 90 seconds and then resurrected from a template.
It was trying to imagine myself in this position that eventually led me to realise that, in order to make sense of it, I had to lose any trace of dualistic thinking. After a few false starts, I realised that it's dualism that leads to all the 'problems' we have in thinking about these things. Fully accepting that our minds are patterns of information actually simplifies things, and it made me realise that statements such as "a copy of me would not be me, but someone else" are total nonsense.
The language we use when talking about these things actually makes them harder to understand. 'Copy', for example, carries connotations that don't apply when talking about information, so saying something like 'a copy of my mind' is a bad way of putting it. Even the phrase 'my mind' is wrong, and reinforces dualistic thinking: if 'I' HAVE a mind, then I and my mind are separate things. So then what am I?
Hopefully, you can see that this is a red herring. If our minds are information patterns, then 'I' don't HAVE a mind, I AM a mind. The mind that my brain is producing is actually what 'I' am. I am information. A complex, dynamic pattern of information. It necessarily follows that if that information is read and then instantiated somewhere else, so that the information processing goes on in the same way as in my original brain, then that is me. I am now somewhere else.
The logical consequence of this is that if the same information is instantiated in more than one place, there is now more than one me. Weird, yes, but necessarily true. And if my original brain is destroyed, but the mind that it used to produce is running in a different processing substrate, I'm not dead, I'm in that different processing substrate. Not a 'copy', but the actual real me.
Once you realise that minds are information, the confusion goes away. As John K Clark has said, science tells us that there are only 3 things: Matter, Energy and Information. (I'd modify that, and say there are only 3 things: Space-time, Matter-energy, and Information, but it doesn't really matter). Minds can only be one of these things. Once you fully accept that, dualism can be dispensed with, and things like uploading and branching identity easily make sense.
> What things do you believe are necessary for one to survive? Would every synaptic weight have to be determined exactly, or is there some factor of "close enough" (say, if it is as similar to how you were two weeks ago, that is sufficient)?
The recent fruit-fly upload seems to suggest that recording individual synaptic weights is not actually necessary (which surprised me); apparently it's the number of synaptic connections between neurons that's important. Maybe this won't be the same with human brains, but we'll see.
My suspicion is that as long as you get the detailed connectome right (plus things like the types of neurons), this will establish 'attractor states' that are fairly tolerant of minor differences. Inaccuracies in things like connection strengths would then be less important: maybe you would wake up feeling a bit strange, but that would soon fade as things settle into their normal states. That's just speculation, really, or maybe wishful thinking, but I'd guess that uploading could actually turn out to be a lot easier than we think, given a certain level of technology (mainly for the scanning; I'm pretty confident that will always be the hardest part).
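To make the 'attractor state' intuition concrete, here is a toy Hopfield network sketch (nothing like a real brain, and all parameters are illustrative): it stores one binary pattern via Hebbian weights, then shows that the network falls back into that stored pattern both from a corrupted starting state and with the weights coarsened to their signs alone, roughly analogous to recording connectivity rather than exact synaptic strengths.

```python
# Toy Hopfield network illustrating attractor tolerance to noise.
# Purely a sketch of the intuition; sizes and noise levels are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
N = 64
pattern = rng.choice([-1, 1], size=N)        # the stored 'identity' pattern

# Hebbian outer-product weights, zero diagonal (no self-connections)
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

def settle(state, weights, steps=20):
    """Synchronously update units until the state stops changing."""
    s = state.copy()
    for _ in range(steps):
        new = np.sign(weights @ s)
        new[new == 0] = 1                    # break ties consistently
        if np.array_equal(new, s):
            break
        s = new
    return s

# 1) Corrupt the starting state: flip ~14% of the units.
noisy = pattern.copy()
flip = rng.choice(N, size=N // 7, replace=False)
noisy[flip] *= -1
recovered = settle(noisy, W)

# 2) Coarsen the weights to signs only (connectivity without strengths).
W_coarse = np.sign(W)
recovered2 = settle(noisy, W_coarse)

print(np.array_equal(recovered, pattern))    # True: noisy state falls into the attractor
print(np.array_equal(recovered2, pattern))   # True: even with sign-only weights
```

The point of the second run is the one made above: the dynamics are governed by the connection structure, so degrading the precise weights still leaves the same attractor in place.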
--
Ben