[ExI] The Anticipation Dilemma (Personal Identity Paradox)

Lee Corbin lcorbin at rawbw.com
Tue Jul 17 10:20:09 UTC 2007


Stathis writes

> Memory erasure, for one, presents problems for the anticipation
> criterion for survival. To be consistent, I would have to say that
> memory erasure is equivalent to death, and that if I'm not worried
> about memory erasure then I shouldn't be worried about death.

Yes, thanks for seeing that and saving me the trouble of coming
up with a scenario. (Though if anyone else is doubtful, I'll be happy
to provide one.)

In your medical practice, have you yourself been under the influence
of midazolam?  (Sorry if we've had this discussion before---I really
don't keep as straight as I should who said what months ago.)
Now the *act* of taking midazolam, of course, is worry-free,
since future versions of you will be memory-supersets.  But an
hour later, the creature "you" may well believe that he's going
to die. Is that the case?

> Equivalently, I could say that if I'm not worried about dying as long
> as my copy in the next room lives, then I shouldn't be worried about
> dying at all, or at worst I shouldn't be worried about dying as long
> as there is someone to continue my projects after I'm gone.

So let's say that you have terminal cancer (heaven forbid), and
are going to die in three months, and it so happens that a copy
of you was made four months ago, frozen, and can be cured.
If I understand correctly, you do not believe that you will survive
in this scenario. Therefore, do you care whether your duplicate
is defrosted and continues your projects, or whether a brand new
(and very energetic and thorough) person is found in the
unemployment lines who will capably continue your projects?
(And let's leave out, for convenience, any familial attachments
and so on that are in principle irrelevant.)

> Anticipating the next moment is what evolution has programmed
> us to consider survival. If you start messing with that
> programming, you could as easily redefine survival to mean
> anything else, such as survival of a copy despite your death
> or survival of the human race despite your death and the
> death of all your copies.

Yes.  In fact any "editing" of my instincts and unconscious
processes---not to mention conscious processes and capabilities
---needs to be done with the utmost caution.

In my daydreams of one day being resurrected by advanced
nanotechnology in the year 2061, I fully hope to have a vastly
intelligent robot (real or uploaded) at my side who can guide
me in my decisions.  "Uh, you might not want to mess with
that right now, Lee, while your IQ is still less than 200 and
your hippocampus has not been adequately augmented.
Recall what your friend Stathis was talking about back in
2007?  He was quite right."

Lee
