<div dir="auto"><div><br><br><div class="gmail_quote gmail_quote_container"><div dir="ltr" class="gmail_attr">On Sun, Mar 22, 2026, 11:00 AM Ben Zaiboc via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On 22/03/2026 13:56, Jason Resch wrote:<br>
> On Sun, Mar 22, 2026, 8:08 AM Ben Zaiboc via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br>
><br>
> On 22/03/2026 11:33, Jason Resch wrote:<br>
> > Consider the case where you upload your mind to 5 different computers at once.<br>
> ><br>
> > Subject each of the 5 instances to gradual modifications (via different experiences) so that you end up in the end with 5 very distinct persons with different memories and even personalities.<br>
> ><br>
> > Have you become these 5 different people?<br>
><br>
><br>
> Of course.<br>
> What other possibility is there?<br>
><br>
><br>
> > If so what principle makes them them all you (when they're run on different computers and have different psychologies)?<br>
><br>
><br>
> I'm not sure what you mean by 'what principle'. Each one of them is an upload of you, that has changed in some way. Each one is no different than if you had remained biological and made the same changes (assuming the suggested changes are possible in a biological person).<br>
><br>
> Are you having difficulty with the 'branching identity' idea (do you not think that there can be 5 'you's), or is it something else?<br>
><br>
><br>
><br>
> I am illustrating the fact that identical psychological states are not bearers of identity.<br>
<br>
<br>
I'm not sure what this means. What's a 'bearer of identity'?<br>
Are you saying that identical psychological states are not different?<br>
<br>
<br>
><br>
> For if you could survive as these divergent people, then we can imagine surviving via more substantive modifications. For example, slowly and gradually tweaking the weights of your brain over time to exactly equal the brain state of your friend Bob. You must then conclude that you have survived this process, and yet, now we have the situation where Ben's brain has become identical with Bob's brain.<br>
<br>
<br>
Don't be silly. How could I have 'survived this process', if I've been transformed into Bob? You are explicitly saying that Ben has been turned into Bob. That means that where there was a Ben, now there is a Bob. I wouldn't even be a Bob that remembered being Ben, because the brain state is exactly equal to Bob's.<br>
<br>
<br>
> Does this imply that you and Bob are now essentially the same person?<br>
<br>
<br>
No, it doesn't imply anything. It explicitly states that there is no more me. There are now two Bobs, and no Bens. You are actually saying this, I don't know why you are asking these questions when the answers are so obvious.<br>
<br>
<br>
> Would backing up and uploading Bob's brain not count equally as preserving your brain? Could you then survive as an upload of Bob?<br>
<br>
<br>
Just read what you're writing!<br>
<br>
<br>
> What if we skipped the transformation process we subjected you to, and simply uploaded Bob? Would you survive as the upload in that case?<br>
<br>
<br>
Urgh. Facepalm.<br>
<br>
<br>
> If not, what does the act of putting Ben through the experience of morphing into Bob add to the situation?<br>
<br>
<br>
I don't understand what you're asking.<br>
It adds a Bob? (It also removes a Ben).<br>
<br>
<br>
> If this morphing from Ben to Bob were done in the Andromeda Galaxy unbeknownst to you here on earth, would that provide the necessary glue to stitch your minds and thus enable your survival?<br>
<br>
<br>
I don't understand what you're saying.<br>
<br>
<br>
> What if this morphing happened in some other branch of the multiverse?<br>
><br>
> Across the multiverse every physically allowable transformation between any two mind states happens somewhere.<br>
><br>
> Drawing unique personal boundaries around minds then becomes rather difficult.<br>
<br>
<br>
Nonsense. You're saying that rewiring someone's brain so as to turn them into someone else makes it difficult to distinguish between different minds?<br>
This is a new and exciting brand of nonsense.<br></blockquote></div></div><div dir="auto"><br></div><div dir="auto"><br></div><div dir="auto">But it follows from your earlier (strong) insistence that all 5 uploads, despite each ending up with very different final mind states, would all be you.</div><div dir="auto"><br></div><div dir="auto">Jason:</div><div dir="auto"> >> Have you become these 5 different people?</div><div dir="auto">Ben:</div><div dir="auto">> Of course.</div><div dir="auto">> What other possibility is there?</div><div dir="auto"><br></div><div dir="auto">So I would ask you for a resolution, or at least an explanation, of this inconsistency, since above you deny survival in situations where one's mind has been modified to equal another.</div><div dir="auto"><br></div><div dir="auto">To clarify (or muddy) the situation, consider that Bob could in principle be arbitrarily close to you (e.g., an identical twin with the same upbringing, the same education, and the same hobbies/interests).</div><div dir="auto"><br></div><div dir="auto">These are just classic fission/fusion cases of personal identity. Given that they have revealed an inconsistency in your predictions, we must now attempt to identify the source of this inconsistency in your assumptions or reasoning. (We are now doing philosophy.)</div><div dir="auto"><br></div><div dir="auto">Jason </div></div>