[ExI] Uploads are self

Jason Resch jasonresch at gmail.com
Sat Mar 21 17:24:27 UTC 2026


On Sat, Mar 21, 2026, 12:53 PM Ben Zaiboc via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On 21/03/2026 12:38, Jason Resch wrote:
> >
> > On Fri, Mar 20, 2026, 7:55 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
> >
> >     On 20/03/2026 01:02, Jason Resch  wrote:
> >     > On Thu, Mar 19, 2026, 11:46 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
> etc (!)
>
> >     > I think we agree broadly about this, but that you may still be
> missing my point here. Think about the question: "Of all the beings that
> exist in the universe, how do you know which one is you?"
> >
> >
> >     I don't even understand the question. I don't have any access to
> anyone else's inner experience, let alone all the beings in the universe,
> so there's no need to identify myself to myself. I'd say this is a
> non-question.
> >
> >
> > What makes it such that when you upload an approximate capture of "Ben
> Z.'s brain state" into a computer and run it that you should suddenly then
> have access to the inner experiences of this computer brain emulation?
>
>
> Who are you talking about here: "you should suddenly then have access"?
> I'm not sure what this means.
>


I am talking about subjective survival. By this I mean:


If Ben's biological brain dies, but it is scanned, uploaded into a
computer, and run, and it continues to act like Ben, can the Ben whose bio
brain died expect to live on *subjectively*, both "in" and "as" this
upload -- in the same sense that Ben would have expected to carry on
subjectively had his bio brain not failed him?


So far, you seem to be avoiding this question, but it is the crux of what
my paper is attempting to answer. At the bottom of this email you hand-wave
the question away and say it doesn't matter: so long as the upload is
functionally equivalent, that's the best we can do and nothing more can be
said.

It is true that no more can be said if we limit our arguments to
empiricism. But the whole point of my write-up is that *more can be said*
if we expand our scope to include rational arguments, but you seem allergic
to these.

As such, I don't think we can make any more progress on this thread unless
you are willing to engage with the question above. You may continue to say
it is an irrelevant, meaningless question, and there is a certain logic in
that position, but it sidesteps what most people seek to preserve by
freezing their brains and later uploading. If you wonder why cryonics is
not more popular, it is in no small part because so little has been done to
try to answer this very question, which matters to so many.


Jason


The emulation will have subjective experiences, provided it is accurate
> enough to capture the processes that give rise to subjective experience.
> If the upload is 'approximate', as in, missing some information, I suppose
> it would feel as though I'd suffered some brain damage (or if it was
> something that I couldn't detect myself, other people would detect it).
> This, if it occurs, would probably be fixable, a lot more easily than with
> a biological brain.
>
>
> >
> >     > You don't decide who you are by checking the name on the ID card
> in your wallet. Instead you use the simple fact: "I am the one having the
> direct, immediate experiences of being Ben Z."
> >
> >
> >     Yes, because I am the only one who can.
> >
> >
> >     > In other words you rely on this feature of the subjective
> experiences you have access to, to decide which person (out of all the
> people in the universe) you happen to be.
> >
> >
> >     There's no need to rely on anything, because there is no other
> possibility.
> >
> >
> > If that's so, then uploading is a dead end.
>
>
> Does Not Compute.
> I have no idea why you're saying this.
>
>
> >
> > You need some principle that expands the set of "internal conscious
> states that you have access to," if you are to have any hope of subjective
> survival via uploading.
>
>
> Why on earth would that be necessary?
> I would need /exactly the same/ set of conscious states ('internal' is
> redundant) as my original brain. Nothing more, nothing less (yes, that is
> also redundant, I'm just using it for emphasis). Anything else would not be
> an accurate upload.
>
>
> >
> >     I'm fine with a method that is not needed being flawed. As far as I
> can see, "deciding who it is you are" doesn't actually mean anything.
> >
> >
> > It is necessary to answer that question if you want to have hope of
> surviving as an upload.
>
>
> No, it's not.
>
>
> >
> >     > I am saying something a bit different than "we are all one" (which
> is ambiguous and mystical sounding). What I am saying is rather "all
> experiences are mine" because they all have what is required for any
> experience to be mine: they all feel as if they are mine.
> >     >
> >     > By this I am not saying all experiences are "Jason R.'s" or all
> experiences are "Ben Z.'s", I am saying all experiences have the properties
> required to make them mine -- every experience is felt as if it is
> happening to me (in a first person, direct, and immediate way).
> >
> >
> >     I can't make any sense of this at all.
> >     All experiences are not mine, only my experiences are.
> >
> >
> > You are using the word "my" to do all the heavy lifting in the above
> sentence.
>
>
> ???
> Ok, instead of the phrase "my experiences", I could say the experiences of
> the mind containing the self-referential model that is produced in the
> brain of Ben (biological or emulated). Is that long-winded enough to do the
> required 'heavy lifting'?
> The fact remains, I don't have access to any other experiences. Of course.
> There is no mechanism, even in theory, to achieve that.
>
>
> > How do you define the scope of experiences that are (or will be) yours
> vs. those that will always remain the experiences of others? This is the
> primary problem in the philosophy of personal identity.
> >
> >     It's impossible for me to experience something that someone else is
> experiencing.
> >
> >
> > But how do you distinguish self from someone else? The subject of this
> email thread is "are uploads self?" What makes it such that this computer
> over here, by running a particular sort of program, turns into something
> that will create experiences that *you* will have? But then, if we change
> the program slightly, suddenly the experiences it generates are *no
> longer* the sorts of experiences you will have?
> >
> > Explain to me how you think this works.
>
>
> /There is no need to 'distinguish self from someone else'/; it's just a
> natural consequence of being conscious.
> That only means anything from a third-person perspective. How does Bill
> know that Ben is not Kevin? etc.
> From a first-person perspective, you are asking "How does Ben know that
> he's not anyone else except Ben?"
> Can't you see what a stupid question this is?
>
>
> >     > [Some Gibberish]
>
> >     This is gibberish. I thought you didn't want to be 'ambiguous and
> mystical sounding'.
> >
> >
> > Well, this gibberish is the inevitable result that follows from not
> requiring 100% exact instances [in order] to survive as an upload. You
> can either accept these consequences OR say that if one neural weight is
> not exactly right, you won't survive the upload process.
>
>
> "100% exact". What does that mean? If you're talking about individual
> neural weights, why not talk about the exact nature and position and state
> of every molecule? Why not go down to the quantum states of every particle?
>
> There is no such thing as "100% exact". We will doubtless find out what
> degree of accuracy is needed, as we do more animal studies, and test the
> uploads against original behaviour. The fruit fly upload indicates that we
> will need a connectome (pretty obvious) and the degree to which each neuron
> affects its connected neurons (seems to be the number of synapses, so far).
>
> So you're asking "what if one single connection is not the same as in the
> original brain?". I doubt if anyone (except perhaps a philosopher) would
> worry about that. I suppose there will be a percentage of errors that will
> be acceptable, beyond which we can reasonably doubt that we are actually
> creating a meaningful upload of the original brain. I don't know what that
> percentage will be, and neither does anybody else yet.
>
> Next, what do you mean by "survive" the upload process? I find it an odd
> thing to say. It would seem to make more sense to say "the upload is
> successful".
>
> ...
>
> >
> >     Fortunately, I don't need to go with Alcor, which is looking more
> and more like a risky choice, given recent events in the US.
> >
> >
> > What alternatives have you looked at, if you don't mind my asking?
>
>
> https://www.tomorrow.bio/
>
> ...
>
> > Could you point out the mistake I made in this section, and describe an
> experiment that empirical science could perform, using any conceivable
> future technology, that would settle the question I present below:
> >
> > ___________________________________________
> > Before we get into the philosophical arguments, it is worth taking some
> time to see why such arguments are the only path available to progress on
> these questions.
> >
> > The reason is that empirical science, being that which is practiced by
> way of objective experiments, cannot answer these questions in a
> satisfactory way. This remains true no matter how advanced technology
> becomes in the future.
> >
> > Consider the case where we transferred John’s biological brain into a
> functionally-equivalent silicon brain in a new robot body. What we are
> interested in is whether John’s original self has survived the transfer to
> this new body.
> >
> > But no matter what question we ask of this robot instance of John, it
> will (owing to functional equivalence) always give the same answers as had
> we asked the original John with his biological brain.
> >
> > If we ask, “Hey John, are you in there?” He’ll answer, “Yes, I made it!
> I am here.”
> >
> > If we ask, “Do you still feel like yourself?” He’ll answer, “I feel the
> same as before.”
> >
> > If we ask, “But is it the real, original you?” He’ll answer, “Yes, it is
> me. I survived!”
> >
> > John’s insistence that he has survived is fully predictable from the
> mere fact of functional equivalence between the biological and silicon
> brains. They behave the same and hence will give the same answers in reply
> to the same questions.
>
> Exactly!
>
> The 'mere fact' of functional equivalence is all that is needed. That is
> what uploading is supposed to achieve, and that is what I, and I expect the
> vast majority of others who are interested in uploading, want. If the
> upload is functionally equivalent to the original brain, then it worked.
>
> To determine that, we would, of course, be doing more than just asking the
> questions you give above. We would probably be doing lots of tests, and
> basically calibrating the upload with a ton of data from the biological
> original (as well as data about generic human brains). Ideally, anyway. I'm
> sure there will be cases where this is not possible, and we'll just have to
> trust in the fidelity of the scan. But by the time human uploading is
> attempted, there will be a solid body of animal research confirming that
> the upload indeed corresponds, in every practical way, to the original
> animal.
>
> You say that science 'cannot answer these questions in a satisfactory
> way'. Of course it can. If you require anything else, you are wandering
> into the (non-existent, except as concepts in certain minds) realms of
> mysticism, and invoking dualism.
>
> --
> Ben
>
>