[ExI] Uploads are self
Ben Zaiboc
benzaiboc at proton.me
Mon Mar 23 21:32:41 UTC 2026
On 23/03/2026 16:21, Jason Resch wrote:
> On Mon, Mar 23, 2026, 6:10 AM Ben Zaiboc via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>
>
> > Ben Wrote:
> > You're saying that rewiring someone's brain so as to turn them into someone else makes it difficult to distinguish between different minds?
> >
> > This is a new and exciting brand of nonsense.
> >
> >
> >
> > But it follows from your earlier (strong) insistence that all 5 uploads, despite each ending up with very different final mind states, would all be you.
> >
> > Jason:
> > >> Have you become these 5 different people?
> > Ben:
> > > Of course.
> > > What other possibility is there?
> >
> > So I would ask you for a resolution, or at least an explanation, of this inconsistency, since above you deny the cases of survival in situations where one's mind has been modified to equal another.
>
>
> There's no inconsistency.
> The five me's will all remember their past as me. The overwritten me that becomes Bob won't. That one will remember Bob's past, not mine.
>
>
> Think of the conscious experience of riding a bike, enjoying the sun on your face and the wind in your ears, simply being in the moment.
>
> How much of your life's past memories are you recalling during such a conscious experience? If none, then what makes this experience any more Bob's than it is Ben's?
That's a silly question. If Bob is experiencing it, it's Bob's experience. You don't have to keep remembering who you are to actually be who you are. People don't turn into each other if they're not careful enough.
Are you suggesting that one person can somehow have another person's experiences? (What would that be, telepathy?)
I can only say that's never happened to me, and I don't know of any mechanism by which it could happen.
>
> If you forgot what you had for breakfast 217 days ago, does that mean it was someone else (besides you) who consciously experienced eating every bite of that breakfast? What has become of that person?
>
> Memories matter for giving a person context for their life, but they're largely irrelevant when it comes to having a feeling of being alive and conscious. You don't die or lose consciousness when there's a name you can't remember.
Ok. I don't think anyone's going to disagree.
>
>
> Tinkering with someone's brain to re-wire it so that it becomes a copy of someone else is effectively uploading that someone else over the top of the original brain, destroying its original pattern. In that case, the atoms remain (which is irrelevant), but the pattern is overwritten completely.
>
>
> Which neural weight change results in your death: the first, the 5,647,822,953rd, the last, or something else?
I don't know. There would probably be a stage at which the partly-completed process wouldn't result in a viable person at all, either Bob or Ben, due to differences in the details of the connectomes. It would depend on the details, and would probably be different for different pairs of subjects.
>
>
> >
> > To clarify (or muddy) the situation, consider that Bob could in principle be arbitrarily close to you. (E.g. an identical twin with the same upbringing and same education and hobbies/interests).
>
> Not possible.
> You can have 'arbitrarily close' in maths (and philosophy), but not in real people.
>
>
> Across an infinitely large universe such arbitrarily close instances exist.
If someone was that similar, there would be two Bens. Changing one of them into the other wouldn't be necessary, they'd already be there. And don't ask me how close they'd have to be, I don't know. Let's call it 'arbitrarily close'.
You seem to be fascinated by the question of 'how close' different minds can or might be in structure, and by the fantastical implications of this. It doesn't matter.

We are each like a little rabbit in a little box. It doesn't matter how many boxes with rabbits there are; it doesn't matter if there are a million identical rabbits, nearly-identical rabbits, or totally different rabbits. Each rabbit is just itself in its own box, eating its own piece of lettuce. None of the others makes a smidgeon of difference to any particular rabbit. If you wonder what would happen if you chopped a leg off two rabbits and swapped them over, the most likely answer is two lame rabbits, but they'd still be two individual rabbits.
> But regardless, denying the setup of a thought experiment to avoid a conclusion you don't like isn't helpful. Instead we should treat thought experiments that reveal a weakness in one's assumptions as a gift: an opportunity to refine and perfect one's thinking to better approximate reality.
This is not about conclusions I don't like. You're the one who's assuming certain things and framing thought experiments to justify them. I'm just going on what we actually know about brains in the real world: provisional facts that we have established from experiments on, and observations of, real things like brains.
>
>
> There will be a practical limit to how close two people can be, because they will have to have different experiences and memories, arbitrary differences in brain wiring, etc.
>
>
> We test theories by imagining situations in which they might break down. There's nothing physically impossible about the scenario I've proposed. It is worth thinking through.
>
> Indeed in cases like the quantum eraser experiment, it is arguable that two divergent mind states fuse back as one. Physicists already have to confront this possibility.
>
>
> > These are just classic fission/fusion cases of personal identity. Given they have revealed an inconsistency in your predictions, we must now attempt to identify the source of this inconsistency in your assumptions or reasoning. (We are now doing philosophy)
>
> There's no inconsistency, and no need for any philosophy.
>
>
> Then you should be happy to give your answer to what happens when the conscious states of two formerly different people intersect.
I can't give an answer to that question, because it's never been tried, and probably can't be done, so there's no relevant data. I'm not even sure what it could mean, let alone how it could be achieved. Could a memory, for instance, from person A be inserted into the mind of person B? I've no idea. As a thought experiment, I'd guess it would mean that person B would remember something that happened to person A. Theoretically. But actually? Nobody knows.
Anything more extreme could well result in a dysfunctional mind. Imagine trying to 'intersect' the mechanism of a grandfather clock with that of a pocket watch. But I don't know; maybe it could be done, maybe not. Without some experiments along these lines (which, of course, would be highly unethical), nobody can know. Thought experiments are all very well, but they need to be based on relevant data. Otherwise they are just wild imaginings. They also should produce testable predictions. A web search on "what testable predictions has the theory of personal identity produced?" gives no results that mention actual testable predictions. Predictions, yes, but not ones that can be tested. If you can find any, please let me know.
>
> These are matters that can be decided with science. It doesn't matter what kind of wild thought-experiments can be dreamed up, in reality it's neurology and physics that will determine these things.
>
>
> As I've explained, there are questions that can't be decided by objective empirical means. This seems to be a question in that class.
>
> Physics reveals only how things behave. It doesn't tell us anything about what things are.
If physics can't tell us 'what things are', which other branch of science can? Science is the only tool we have for actually figuring things out. Nothing else has worked, ever.
--
Ben