[ExI] Essential Upload Data
Ben Zaiboc
ben at zaiboc.net
Sat May 16 08:45:00 UTC 2020
On 15/05/2020 08:09, Re Rose wrote:
> Hiya Ben,
>
> Hm, I didn't mean to change my mind LOL. Thanks for the link, I'm
> familiar with the Carboncopy site and also their weekend online
> seminars, which are very interesting. I just don't agree that the
> technology will result in the initiating agent's reanimation. It
> certainly has the ability to create a new agent, no question. But the
> new agent is not the person whose brain was copied. It's an entirely
> new type of derivative being, in possession of your memories and all
> your data. It will be interesting, and that's all I can say :)
>
> You are not at all a copy of who you were 2 seconds ago, you are one
> integrated agent and time has passed. You have more experiences, 2
> seconds more. But you are safely you.
>
> A perfect copy of your mind is not at all necessarily you! You are not
> only your mind, you are an integrated agent with a mind, an agent
> whose mind is tuned and accommodated to the body it inhabits. It may
> seem simple to rewire the cortex to a new body (and in fact I am
> excited to see how Canavero's head-switching surgery goes, as that
> can give insight into the rewiring of a cortex into a new body with
> many new sensors, new hormonal cycles, and new system construction -
> without dealing with lossy upload issues) but as I posited in previous
> posts I don't believe it is. In fact I think it may be too much for
> a human cortex to handle, and may induce mental issues due to the
> overwhelming nature of a mature brain re-learning all sensory input.
> Like a multi-year exposure to a never-ending LSD trip. Sounds awful to me.
>
> But, as I've alluded to in previous posts, we are not close to
> understanding how to copy the data in the brain, and I believe it may be
> difficult or maybe impossible to reconstruct. So a "perfect copy" is a
> glib supposition. It's not a hard drive.
>
> An amoeba is a different system. Much simpler, and of course you are
> correct, asking which one of a split amoeba is the "real" one is
> meaningless. However an amoeba is not a brain, or a person. It's not
> even self-aware because it does not have any neurons, nor any memory.
> It's just a nice little machine - a reactive cell. I had a train when
> I was little; it had these bumpers on it, and when it hit a wall or the
> furniture (or my Dad) it reversed. In this way it looked like it was
> exploring the room, and I loved it. But it had no memory; even though
> every exploration it took was different, that was because it was
> iterative and mechanical. It seemed alive, but it was too simple to be
> so. An amoeba is similar. So a copy of an amoeba is fine, because the
> amoeba-biological-machine is not really self-aware in the first place.
> IMHO!! YMMV! Of course :)
>
Aha! Retrieved my original reply:
It certainly does (my mileage may vary).
What you don't seem to appreciate (or accept) is that 'a copy' and 'an
integrated agent, and time has passed' are the same thing, when you look
at things in detail.
You accept that atoms (i.e. material) are not important, and information
(the arrangements and interactions of atoms) is, yet insist that a copy
(the same information) would somehow not be the same 'you'. This is
inconsistent. At least, it is as long as you accept that atoms and
information are the only players of any significance. If you are
information, and the same information appears somewhere else, you appear
somewhere else. This is inescapable.
You say "You are not only your mind, you are an integrated agent with a
mind". If that is so, then the 'you' is a separate thing to 'your' mind.
What is that thing? As far as I can see, I'm not a separate thing that
'has a mind', I am a mind. Saying 'you are not only your mind' is really
just saying that 'mind' consists of more than we normally think it does.
I don't disagree that I'm an integrated agent, but that's 'my mind' (or
rather, the mind that is me). They are the same thing. And whatever it
encompasses, that is all embodied information. So if the information can
be extracted and embodied in another form, then I am there, in that
other form.
I don't see how reproducing the inputs that a human cortex is designed
to handle could be too much for a human cortex to handle. Isn't that an
oxymoron? Getting your wires crossed, and having auditory input go to
your visual centres, and so on, would indeed be terribly confusing, and
possibly drive some people mad, but that's an engineering glitch, not a
fundamental objection to uploading. It just means the job was botched
and needs to be done again. Undoubtedly there will be such glitches,
that we'll learn from before getting it right. The only objection I can
sensibly come up with is "I wouldn't want to be the first upload".
When I say "a perfect copy", I mean a "good enough copy", naturally. We
don't yet know what that will consist of. Again, that's not a
fundamental objection, it's just a lack of knowledge which we will
overcome sooner or later.
I picked an Amoeba as an example deliberately, because it's a living
system. It uses the same mechanisms as our neurons for all the basic
functions of a living cell. If you can't tell the difference between the
'original' and the 'copy', then the same would be true if a neuron was
to multiply all its internal structures and split in the same way. If
the two daughter neurons also retained all the connections, there would
be literally no way to tell them apart, and if we were to destroy one of
them, it wouldn't matter in the slightest which one was picked.
The same would be true of an entire brain that was duplicated. Or an
entire body. At the split second that the copy was made, one could be
destroyed and it wouldn't make any difference. After that, of course,
the two duplicates would start to accumulate different experiences, and
become two different people, but with common memories. Each would be as
different from the original as each other, though. Each would have an
equal claim to be a continuation of the original person (which only
exists in the past).
--
Ben Zaiboc