[ExI] Destructive uploading.
Florent Berthet
florent.berthet at gmail.com
Mon Sep 5 22:12:53 UTC 2011
Alan, I understand what you are saying and I'm probably not making myself
clear when I say every clone and upload "is you" (by the way, by "upload" I
mean that I have essentially copy-pasted your brain onto a computer in a safe
way, resulting in two distinct persons; I'm not talking about a transfer of
mind, which would result in a single thinking entity). I'm not saying that
your inner self changes when you are cloned, and of course each clone has
its own individual personal experience.
I'm not saying you are not unique; each person and each clone is unique. I'm
saying that even if there is indeed an original, the question of destructive
versus safe uploading is not relevant, because both of them result in the
same thing. And I'm not saying this as "nobody else will notice, so we don't
care if the original dies!"; I'm saying this as "it makes intrinsically no
difference for anybody - not even for the guy we want to upload - if the
original dies in the process".
This is a tough thing to discuss, so bear with me: if I told you I was going
to clone you and torture your clone later on, would you be scared to be in
pain or would you just think "oh, the poor guy"? I guess you'd think the
latter. But if I told you I would torture you tomorrow, you'd probably be
scared for yourself. Well, what I'm saying is that, as counter-intuitive as
it is, you shouldn't be any more afraid of being tortured yourself tomorrow
than of your clone being tortured. OK, that's a bold claim, but if you
really think about it, it makes sense. Why are you
afraid for yourself and not for your clone? You will say it's because you
are tied to your own nervous system, and not to your clone's, obviously.
Well, sure, but what is this "you" that is tied to this nervous system? It's
only a brain pattern. And don't think it is "your" brain pattern just
because it's made of "your" atoms; these atoms could be interchanged
without any consequence, hence the examples where we change your atoms one
after another. These examples are useful for finding where people draw the
line between "it's still me" and "it's another person".
So tell me where you draw the line:
- if I were to freeze your brain, stopping all electrical activity for a
second, would you say you'd be the same after I turned you back on?
- what if I were to freeze your brain, take a small part out and put it back
safely while frozen, then turn you back on?
- what if I were to freeze it, disassemble it and recreate it atom by atom
in your skull, without changing anything else? Would this still be you?
Would you mind if we did this?
If you say yes to these three, then you'd probably be afraid if I told you I
would torture this reconstructed you after your "surgery". And then,
logically, in the previous example where I torture your clone, you should be
afraid too, because this clone is no different from a reconstructed you
(the only difference in this case is that the original is still alive, but
why should this impact the clone's fate?). If you say no somewhere, then I'm
interested in your reasoning.
> Yeah, sure you could copy my pattern. That doesn't mean I don't own it or
> that the act of copying my pattern would benefit me in any way.
That is precisely where the illusion lies. What I'm arguing is that the
concept of something benefiting somebody is meaningless because the notion
of a continuous "somebody" is wrong. I'm saying that consciousness is not a
continuous entity living in one's brain. Instead, consciousness is simply
something that happens when some conditions are met in a substrate,
regardless of the past conditions of the substrate. We feel we will be the
same person tomorrow, but that's just an illusion that serves our evolution;
it makes no sense physically. "You" won't wake up in your bed tomorrow;
instead, tomorrow, your body will spark a consciousness that is very
similar to the one it created the day before, and that's it. You may think
that doesn't change anything, but there's a huge difference: in the first
case you have a good reason to be concerned for your future, and in the
second case your future self is just another person.
What is it exactly that you are afraid of losing by going through
destructive uploading?
My views require a stretch of the mind, but they are not some kind of
esoteric BS; the reasoning is very pragmatic. We feel like our consciousness
"belongs" to us and that it will be so until our death. For that reason we
feel more concerned about our own fate than the fate of others. We believe
this because there's a continuity in our body: we know it's always alive and
going. Plus, each morning when we wake up, we feel like the same person as
the day before. But how could it be any different? Our brain contains
memories, and memories are what make us feel like who we are.
In fact, if you feel more concerned about the fate of your own body, it's
because you think your brain is somehow important. You feel that it's not
just the mathematical pattern that matters (or else you'd also feel
concerned for your copies), but the fact that this pattern is made up of
"your" atoms. What is really hard to grasp is that there is nothing
special about these atoms: they don't hold any number or name. And it's not
about your neurons either: the fact that your neurons have always formed
some kind of electric current in your brain is not relevant; theoretically,
we could briefly pause this current, and there's no reason to think this
would result in a crucial switch of identity. You are just your
consciousness, which is just a spontaneous phenomenon made by a bunch of
atoms interacting with each other. The substrate is irrelevant. Therefore,
why would you think the brain of the "tomorrow you" is crucially different
from the brain of your clone? Both of them will produce a consciousness, and
these consciousnesses will be independent of each other, in the same way
that the consciousness of the "tomorrow you" will be independent of your
current one, because it has no regard whatsoever for the history of the
substrate it comes from.
> In a way, this last clipping is a major conceptual breakthrough. The
> admission that uploading serves no egotistical benefit is just a hair away
> from saying that uploading serves no end whatsoever.
Yes, apart from the fact that I think a future me, along with all the other
uploaded beings, could live an excellent life in a simulated world, and that
this is a good thing in itself.
This argument is also why I don't see cryonics as pertinent from an
egotistical point of view. At best, it's like saying "have fun, future guy
that's like me!". But again, I feel the same about the future me that will
wake up tomorrow in my bed. Yes, this reasoning involves mind-blowing
implications, but I don't see how it could be wrong.