[extropy-chat] The Anticipation Dilemma (Personal Identity Paradox)
Stathis Papaioannou
stathisp at gmail.com
Sun Apr 15 05:09:58 UTC 2007
On 4/14/07, Lee Corbin <lcorbin at rawbw.com> wrote:
>
> Stathis writes
>
> > Let's summarise. You feel that the sort of anticipation which tells the
> > average human that he won't have the experiences of his copy in the
> > next room cannot be rationally justified and should be expunged.
>
> Yes, since if one is going to anticipate *any* future experience, then,
> as a person and his recent duplicate are physically identical in all
> important respects, one should anticipate being *both* of the future
> systems.
>
> > On the other hand, you feel that the sort of anticipation which makes
> > the average human worry more about the future than the past cannot
> > be rationally justified but should be left alone. Is there an
> > inconsistency here?
>
> Well, thanks for pointing this out. Yes, there is an inconsistency, but
> I'll try to minimize it. As much as *logically* these extreme thought
> experiments show that one should anticipate what has already happened to
> one as much as anticipate what is going to happen to one, perhaps there
> just isn't any payoff for doing so? That is, my anticipation module makes
> me drool over a pleasant event coming up tomorrow night, but I only have
> fond memories of the same kind of event that happened to me last week,
> and they're not the same thing. Moreover, so far as *choices* are
> concerned, I can very, very seldom do anything about the past. But
> determining whether my duplicate will get $10M and deposit it in our
> account is important.
But my anticipation module makes me worry more about what happens to me than
what happens to my copy in the next room, in the same way that I worry more
about the future than the past. In fact, there is a sense in which my
relationship to copies of me in the past is the same as my relationship to
copies of me in the next room or in parallel universes that I can't access.
Even if I could change things so that in some alternate history things
worked out better for me, I wouldn't thereby anticipate the past more.
I have agreed with you all along that this sort of thinking is not always
rational and consistent, but there is no universal law saying that our
feelings have to be rational and consistent. There is no rational reason why
I should wish to survive in any capacity at all; it's just that humans have
evolved with a strong desire to survive. We could imagine an AI that was
perfectly rational and yet had no qualms about terminating its existence (in
fact, that would be the best sort of AI to build: we wouldn't want
super-beings around who cling to life as tenaciously as humans do).
Stathis Papaioannou