[ExI] Uploads are self

Jason Resch jasonresch at gmail.com
Sat Mar 21 20:22:13 UTC 2026


On Sat, Mar 21, 2026 at 3:54 PM Ben Zaiboc via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On 21/03/2026 18:06, Jason Resch wrote:
> > On Sat, Mar 21, 2026, 12:53 PM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
> >
> >     On 21/03/2026 12:38, Jason Resch wrote:
> >     >
> >     > On Fri, Mar 20, 2026, 7:55 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
> >     >
> >     >     On 20/03/2026 01:02, Jason Resch  wrote:
> >     >     > On Thu, Mar 19, 2026, 11:46 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
> >
> >     etc (!)
> >
> >     >     > I think we agree broadly about this, but that you may still
> be missing my point here. Think about the question: "Of all the beings that
> exist in the universe, how do you know which one is you?"
> >     >
> >     >
> >     >     I don't even understand the question. I don't have any access
> to anyone else's inner experience, let alone all the beings in the
> universe, so there's no need to identify myself to myself. I'd say this is
> a non-question.
> >     >
> >     >
> >     > What makes it such that, when you upload an approximate capture of
> "Ben Z.'s brain state" into a computer and run it, you should suddenly
> then have access to the inner experiences of this computer brain emulation?
> >
> >
> >     Who are you talking about here: "you should suddenly then have
> access"?
> >     I'm not sure what this means.
> >
> >
> >
> > I am talking about subjective survival. By this I mean:
> >
> >
> > If Ben's biological brain dies, but it is scanned and uploaded into a
> computer and run, and it continues to act like Ben, can the Ben whose bio
> brain died expect to live on, *subjectively*, both "in" and "as" this
> upload -- in the same sense as Ben would have expected to carry on
> subjectively had his bio brain not failed him?
> >
> >
> > So far, you seem to be avoiding this question
>
>
> Well, I thought I was addressing the question all along.
> Ok, let me be clear: The answer is YES!
> Yes, OF COURSE that's what I expect. This is the whole point of uploading.
>
>
> > , but it is the crux of what my paper is attempting to answer. At the
> bottom of this email you hand-wave this question away and say it doesn't
> matter: so long as it is functionally equivalent, that's the best we
> can do and nothing more can be said.
>
>
> No, I'm not hand-waving the question away; I'm addressing it directly.
> Functional equivalence is what we're aiming for, and if it's achieved,
> then Job Done; if not, then we have a problem. That is the top and bottom
> of it.
>
>
> > It is true that no more can be said if we limit arguments to empiricism.
> But the whole point of my write-up is that *more can be said* if we expand
> to include rational arguments, though you seem allergic to these.
> > As such, I don't think we can make any more progress on this thread
> unless you are willing to engage with the question above. You may continue
> to say it is an irrelevant, meaningless question, and there is a certain
> logic in that position, but it sidesteps what most people seek to preserve
> by freezing their brains and later uploading. If you wonder why cryonics is
> not more popular, it is in no small part because so little has been done to
> try to answer this very question, which matters to so many.
>
>
> What I'm allergic to is this barmy idea of 'Open Individualism', which 1)
> makes no sense to me, and 2) is irrelevant to uploading.
>
> The rational argument is: IF functionalism is true (all signs point to
> 'Yes'), AND a successful copy of the brain's connectome and connection
> weights can be made and recreated in a suitable computing substrate capable
> of activating it (progress is being made towards this), THEN the original
> person lives on in the new substrate.
>
> That's about as clear and simple as I can make it.
>

Functionalism is a theory in the *philosophy of mind*. If one accepts
functionalism, then that is enough to establish that *the uploaded mind will
be conscious*.

But functionalism is silent on the question of which experiences,
instantiated in which places, are experiences you can expect to be yours.
This question falls squarely within the domain of the *philosophy of
personal identity*. No theory in the philosophy of mind even attempts to
answer it.

Accordingly, if you want to make the further assumption that the
functionally-equivalent upload not only replicates your consciousness, but
that *you* will personally and subjectively experience life as this upload,
then you must state your assumed theory in the philosophy of personal
identity.

So far you have ruled out empty individualism. You have also ruled out
bodily-continuity versions of closed individualism. You have vacillated on
whether perfect identity is required for pattern identity. If you require
perfect identity in the pattern, that is a version of closed individualism;
however, if you loosen this to allow imperfect identity of patterns in the
upload (after having already abandoned bodily continuity as important),
that leads to open individualism.

You say this makes no sense to you. I am willing to help it make sense if
you ask about anything I have said, or anything that remains unclear.

Jason