[ExI] Uploads are self
Ben Zaiboc
benzaiboc at proton.me
Thu Mar 19 15:45:09 UTC 2026
On 19/03/2026 11:15, Jason Resch wrote:
> On Wed, Mar 18, 2026, 6:06 AM Ben Zaiboc via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>
> On 18/03/2026 04:03, Jason Resch wrote:
>
> Ben wrote:
>
> If you claim that there is nothing special or unique about any particular individual mind, you also have to claim that there is nothing special or unique about any particular piece of music, any particular mathematical equation or any other particular pattern of information. That line of thinking leads to the conclusion that all information is the same thing. Not a particularly useful viewpoint.
>
>
>
> I think you misread what I was saying. I was not saying there's nothing unique or special about any mind; I was saying there's nothing unique or special about a mind in the sense of it being a "privileged I". This is because every mind, from its own perspective, feels it is privileged in this way, just as contemporaries at every point in time consider their "now" to be the special (only existing) point in time.
>
> But the more scientifically valid "block time" view of the universe dissolves the idea of a privileged now, just as open individualism dissolves the notion of a privileged I.
I think the language here is getting a bit too complex, making it difficult to follow (I think you are contradicting yourself above, but I'm sure you don't think so, so some clarification is needed).
Can we say that each mind is a specific information pattern (which is a shorthand for 'a dynamic information pattern with certain characteristics, some of which we aren't yet sure of'), and that of course there are many things that different minds have in common?
And there will be some things that some minds have in common, and some things that all (known) minds have in common. Probably.
You claim that all minds of interest have 'subjective experience' in common. I agree (it's a bit of a tautology, really).
You seem to claim that this means that all minds are therefore the same (?). I disagree.
Apart from the obvious logical fallacy (if all chairs have legs, that doesn't mean that all chairs are the same), there's the tricky problem that we can't, even in principle, measure subjective experience, or compare it between different minds. So the statement "all minds have subjective experience" contains very little information; almost none, I'd say. Can we even define the term? It could be something entirely different for every different person, and we'd never know.
I'd agree with "all minds (of interest) have some things in common", as that's quite obvious. All human minds have a ton of things in common, but they are all still separate minds. (I'm going to stop saying 'unique', because that probably won't be true in the future, at least momentarily, in the scenario of duplicating mind-states.)
I don't see any significance in your 'privileged I'. In fact, I'm not sure what it actually means. It would seem to mean that each person has a first-person perspective. But that's so obvious and trivial that I can't see it being a useful thing to note. And it certainly doesn't follow from that that everyone is really the same person (if that isn't a caricature of your position; I don't /think/ it is, from what you say).
>
>
> >> My suspicion is that as long as you get the detailed connectome right (plus things like the type of neurons), this will establish 'attractor states' that are fairly tolerant to minor differences, so inaccuracies in things like connection strengths will not be so important. Maybe you would wake up feeling a bit strange, but that would soon fade as things settle down to their normal states. But that's just speculation, really. Or maybe wishful thinking, but I'd guess that uploading could actually turn out to be a lot easier than we think, given a certain level of technology (mainly for the scanning; I'm pretty confident that will always be the hardest thing).
>
>
>
> > I think there is possibly one extra step you could take, one final dualism to dispense with, which is the idea that you are defined by a particular/exact information pattern.
>
>
>
>
>
> That's not dualism, that is the exact opposite of dualism.
>
> My whole point is that each mind does consist of a particular, exact information pattern, and nothing else. That this is what a 'soul' (if you should insist on using the word) actually is, that this is the only thing that a mind can be.
>
>
> But you said you could survive as an imprecise upload (giving the fruit fly as an example). For this to be true, a person must be more than an "exact information pattern." You've already loosened that definition to an approximate information pattern.
>
> If one steps into a teletransporter, and emerges on the other side having lost a single long term memory that they hadn't recalled in the past 10 years, is such a memory loss fatal to that transported person's subjective survival? I think not, but am curious to know what you think.
>
> Then repeat the consideration with more and more memories being lost in the process. At what point do these changes flip from the person surviving to the person dying?
We can quibble about what 'precise' means, but the fact is we just don't yet know what level of precision will be necessary for an accurate upload of someone's mind (I was just speculating about the 'attractor state' thing). There will probably be a spectrum, and some kind of consensus will emerge about just how 'precise' the information needs to be.
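The 'attractor state' speculation can be illustrated with a toy model. A Hopfield network (a standard associative-memory model, used here purely as an analogy, not as a claim about real connectomes) stores patterns as attractors: corrupt a stored pattern by a modest fraction of its bits, and the dynamics settle back toward the original. A minimal sketch in Python/NumPy, where the network size, pattern count, and 15% noise level are all arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Hopfield network: store 3 random +/-1 patterns over 100 units
# using the Hebbian outer-product rule.
n_units, n_patterns = 100, 3
patterns = rng.choice([-1, 1], size=(n_patterns, n_units))
W = (patterns.T @ patterns) / n_units
np.fill_diagonal(W, 0)  # no self-connections

def settle(state, sweeps=20):
    """Asynchronous unit updates until nothing changes (guaranteed to converge)."""
    state = state.copy()
    for _ in range(sweeps):
        changed = False
        for i in rng.permutation(n_units):
            s = 1 if W[i] @ state >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:
            break
    return state

# 'Imprecise scan': flip 15% of one stored pattern, then let it settle.
noisy = patterns[0].copy()
flipped = rng.choice(n_units, size=15, replace=False)
noisy[flipped] *= -1
recovered = settle(noisy)
overlap = (recovered == patterns[0]).mean()
print(f"overlap with original after settling: {overlap:.2f}")
```

At this low memory load the basins of attraction are wide, which is the intuition behind hoping that minor scanning inaccuracies would 'settle out'. Whether real brains behave anything like this is, of course, exactly the open question.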
I don't think the example of 'losing a single long-term memory' is very realistic, given the nature of memories and the way we store them, but you are again asking questions that we don't have any answer to yet. Once we start trying to upload human minds, we will doubtless find out. I don't think there is any 'flip from surviving to dying', any more than there is a hard line between a biological creature living and dying.
Have you read Greg Bear's 'The Way' stories? There's a character in them known as 'the architect', whose situation pretty much sums up these questions: he is dead (murdered, if I recall correctly), then 're-assembled' from data and memories taken from different sources, because certain parties needed his knowledge and skills. There's some debate as to what degree this re-constituted person is 'really' the architect, because he is definitely missing some parts of his original mind. There's no definite answer to the question, and I don't think there can be.
>
>
> > Certainly, throughout our lives, the information pattern we identify with
>
>
>
> No, I have to stop you there. This is dualistic thinking. Or at least language that reinforces dualistic thinking. Who is the 'we' you refer to, if you are separating it from the information pattern? It would be better to say "the information patterns that we consist of"
>
>
> Okay you can use that wording. The patterns one consists of change drastically throughout one's life.
>
>
>
>
> > changes drastically. Are there any limits to how much that conscious pattern could change before it ceases being an "I"
>
>
>
> I hope that this question now answers itself.
>
>
> It answers it for me, but I am still not sure if our answers are aligned.
>
> Unless you are asking how simple can a mind be, which we don't currently know the answer to.
>
>
> No, I am asking what must be preserved in the pattern for survival (e.g. of an upload process). If you said it must be 100% identical, then I am afraid perfect uploading will never be realized. If you said some good enough approximation is all that is needed, then we can in theory survive an upload, but then you have broken the need for perfect identity of an information pattern. This raises the question: just what exactly is required to subjectively survive?
>
> You abandoned the notion that a specific group of atoms was necessary to survival.
>
> Now I ask to take the next step, which is to abandon the notion that a specific pattern of information is necessary to survival.
>
> Certainly getting the pattern close is important for preserving what is important to each of us: one's memories, personality, and goals. But my argument is that it is of absolutely no importance when it comes to the question of subjective survival. The person who emerges on the other side of the upload will consider themselves to have survived, even if they lose memories in the process.
I suppose that depends. We can imagine that someone missing significant chunks of memory would be aware of it, and feel that they are incomplete in some sense. If they lost most of their memories, would they be a different person? I don't know the answer to that. I don't think anyone does. There will be certain features of the information pattern that are essential to having a conscious mind at all, of course, but between that and an 'exact' replica, there will be a large grey area, I expect. But there won't be a clear line, on one side of which you are 'the same person', and on the other side of which you are 'a new person'.
I don't think you can reasonably say it's of 'absolutely no importance', though. I doubt you would be happy to undergo uploading if you knew that it would remove or substantially change your memories, personality, and goals. I certainly wouldn't be, and I'd be asking what the hell kind of uploading that is. It reads like the kind of personality reprogramming that criminals undergo in some SF stories, in order to turn them into 'model citizens' (or the kind of thing that would have the Chinese Communist Party rubbing their hands in glee!).
Someone with some kinds of brain damage can change their personality, or lose important parts of their memories. Are they still the same person?
But all this doesn't matter. We will obviously do our best to replicate the original mind as closely as we can, within the limits that animal experiments establish. I don't really see the point of all this talk of incomplete uploads, missing memories etc., when we will do what we can to avoid them. I'm sure that after we have perfected uploading to some degree, we will want to investigate these issues, but it's just not relevant now. When you want to build a bridge but don't know a lot about bridge-building, you over-engineer it, to do your best to make sure it will work. You don't try to figure out the weakest or cheapest, etc., bridge you can build that will still be safe; that stuff comes later.
>
>
> I expect there are certain features, which we don't yet know, that will determine whether an information pattern can be regarded as a mind, or that will give rise to subjective experience. If you're asking what those features are, the only answer anyone can give at present is "We don't know". I suspect we'll find out eventually.
>
>
> These are exactly the sort of questions one must ask to break through to seeing the unimportance of particular details in the pattern as being necessary to subjective survival.
You are assuming a conclusion here. My suspicion is that 'particular details' will be very important - vital, even - for subjective survival, but we don't know what they are. Let's get the answers before drawing any conclusions. This will have to wait until we have the technology needed.
--
Ben