[ExI] Essential Upload Data
Ben
ben at zaiboc.net
Fri May 15 20:17:07 UTC 2020
On 15/05/2020 17:28, Re Rose wrote:
> My biggest concern is that we might discover a technology that is
> "good enough" to fool people. It seems plausible that a copy, or a
> really good ASC preserved brain, should hold the data we need. That,
> IMHO, is a dangerous idea, because people will convince themselves
> that's all you need to preserve yourself. So the goal - to reanimate
> and continue to live your life - will not be met but YOU are not
> around to advocate for yourself! Your copy will be very happy to be
> reanimated, though, just as a stranger or a sibling might be. You are
> not your sibling, or a stranger, though, and you will not be there.
"How would a copy of you know it's not you? Hm. After all, it would be a
being with your memories and experiences and and I believe it will
believe it is you"
That sounds like a nonsensical question to me.
How could you tell that a copy of 'Imagine' by John Lennon was not in
fact 'Imagine' by John Lennon? After all, it would have all the same
notes in the same order, and it would sound just like 'Imagine' by John
Lennon.
You could argue that a piece of music is not the same as a mind, but
then you'd be arguing for something special, above and beyond
information, that constitutes a mind, and you've already said that you
don't claim that. I recognise that minds are not fixed patterns, whereas
a specific piece of music is, but that doesn't affect the argument. A
dynamic pattern of information just contains extra information that
describes how the pattern changes under particular circumstances. In
essence, this is no different to a static pattern. It's all information.
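To make that concrete, here's a minimal Python sketch (a toy example of
my own, nothing to do with real brains): an elementary cellular
automaton. The entire 'dynamic pattern' is captured by two static
pieces of information, an initial state and a rule number, and two
copies of that static description unfold into exactly the same
behaviour.

def step(state, rule):
    """Apply an elementary CA rule (e.g. Rule 110) to a ring of cells."""
    n = len(state)
    return [
        (rule >> (4 * state[(i - 1) % n] + 2 * state[i] + state[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Two copies of the same static description (state + rule)...
a = [0, 0, 0, 1, 0, 0, 0]
b = list(a)
RULE = 110

# ...produce exactly the same dynamic pattern, step after step.
for _ in range(10):
    a = step(a, RULE)
    b = step(b, RULE)
    assert a == b  # indistinguishable at every step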
I can't really get my head around this concept that an identical copy (a
good-enough copy, really) of you isn't really 'you'. Who would it be? It
can't be Napoleon, it can't be Genghis Khan, it can only be you. Saying
"but it's only a copy!" is meaningless. Yes, it's a copy. A copy of you.
Ergo, you. It can't be anyone else, can it?
I wrote a post about the amoeba splitting, but my computer crashed and I
lost it. Basically, I said the amoeba is just like a neuron. If one of
your neurons underwent the same fission process, you wouldn't be able to
tell the two daughter neurons apart, and if you destroyed one, it
wouldn't matter one bit which one was destroyed (assuming the same
connections to other neurons were preserved). Extrapolate this to all
the neurons in your brain, and you effectively have a copy of your brain
which is identical to the original. Do you really think this would
result in a person that was 'not you'?
Ben Zaiboc