[ExI] any exact copy of you is you + universe is infinite = you are guaranteed immortality
Lee Corbin
lcorbin at rawbw.com
Sun Jun 17 22:54:09 UTC 2007
Eliezer writes
> Suppose I want to win the lottery. I write a small Python program,
> buy a ticket, and then suspend myself to disk. After the lottery
> drawing, the Python program checks whether the ticket won. If not,
> I'm woken up. If the ticket did win, the Python program creates one
> trillion copies of me with minor perturbations (this requires only 40
> binary variables). These trillion copies are all woken up and
> informed, in exactly the same voice, that they have won the lottery.
Now at this point your total runtime (congratulations!) is about
a trillion times normal. Your Benefit per UTC Minute is about
10^12 times normal.
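Just to make the arithmetic concrete, here is a rough sketch of the
copy step as I read it. Every name in it is my own invention; it is
not Eliezer's actual program.

    # Sketch only: why 40 binary variables suffice for a trillion distinct
    # perturbations.  The helper names below are invented for illustration.
    N_BITS = 40
    N_COPIES = 2 ** N_BITS          # 1,099,511,627,776 -- just over a trillion

    def perturbations():
        """Yield each distinct setting of the 40 binary variables."""
        for i in range(N_COPIES):
            yield [(i >> b) & 1 for b in range(N_BITS)]

    # if ticket_won():                          # hypothetical check
    #     for bits in perturbations():
    #         wake_copy(suspended_image, bits)  # hypothetical: one copy per setting
    # else:
    #     wake_copy(suspended_image, None)      # woken up unperturbed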
> Then - this requires a few more lines of Python - the trillion copies
> are subtly merged, so that the said binary variables and their
> consequences are converged along each clock tick toward their
> statistical averages.
Sorry to hear about your decreased runtime and decreased benefit.
Ah well, nothing great seems to last forever.
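For what it's worth, here is how I picture that merging step, again
only a sketch under my own assumptions, not Eliezer's code:

    # Sketch of "converge the variables toward their statistical averages,
    # one clock tick at a time."  Purely illustrative.
    def merge_step(copies, rate=0.1):
        """One tick: nudge every copy's variables toward the population mean."""
        n_vars = len(copies[0])
        means = [sum(c[v] for c in copies) / len(copies) for v in range(n_vars)]
        return [[c[v] + rate * (means[v] - c[v]) for v in range(n_vars)]
                for c in copies]

    # Repeat for ten seconds' worth of ticks and the copies become
    # indistinguishable, at which point only one needs to be kept running.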
> At the end of, say, ten seconds, there's only one copy of me again...
>
> What's the point of all this? Well, after I suspend myself to disk, I
> expect that a trillion copies of me will be informed that they won the
> lottery, whereas only a hundred million copies will be informed that
> they lost the lottery. Thus I should expect overwhelmingly to win the
> lottery.
At this point you are depending on the notion of *anticipation*.
I have never been able to form a logical and self-consistent notion
of *anticipation* that accords at all well with our intuitions. For
example, it is possible to end up having to "anticipate" things that
occurred in the past.
> None of the extra created selves die - they're just
> gradually merged together, which shouldn't be too much trouble - and
> afterward, I walk away with the lottery winnings, at over 99%
> subjective probability.
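Taking the quoted numbers at face value, that figure is just
copy-counting:

    won, lost = 10**12, 10**8   # a trillion "won" copies vs. a hundred million "lost"
    print(won / (won + lost))   # 0.99990..., i.e. over 99% subjective probability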
I have believed for many decades that almost every time that
probability is invoked in identity threads, it is misused. For
example, suppose that you are to walk into Black Box A
wherein 999 duplicates of you are to be made. After the
duplicates are created, only one of you---picked at random---
is allowed to survive. Many might suppose that the chance
of surviving Black Box A is only 1/1000. But of course, that's
incorrect. The chance that you will walk out is exactly 1.
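To make the two readings of "chance of surviving" concrete, here is a
toy simulation. It is only my own illustration of the point:

    import random

    # Each trial: you enter Black Box A, 999 duplicates are made, and one
    # of the 1000 resulting instances is kept at random.
    trials = 100_000
    original_instance_kept = 0

    for _ in range(trials):
        survivor = random.randrange(1000)   # index 0 is the original instance
        if survivor == 0:
            original_instance_kept += 1

    print(original_instance_kept / trials)  # about 0.001: the per-instance reading
    # Yet in every single trial some instance of "you" walks out, so on the
    # "any exact copy of you is you" view the chance of walking out is exactly 1.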
Suppose that I know that ten seconds from now a million
copies of me will be made, all the new copies appearing somewhere
on the seashore. Then yes, I will be surprised to still be
here. That is, the one of me who is not at the seashore
will be surprised. But our feelings of surprise, anticipation,
and so on, cannot so far as I know be reduced to a rational
basis.
> I mention this to show that the question of what it feels like to have
> a lot of copies of yourself - what kind of subjective outcome to
> predict when you, yourself, run the experiment - is not at all
> obvious.
Not only would I agree, but I go on to assert that our normal,
daily, usual feelings of anticipation, dread, surprise, apprehension,
and other feelings of subjective probability having to do with identity
cannot be put on an entirely rational basis.
> And the difficulty of imagining an experiment that would
> definitively settle the issue, especially if observed from the
> outside, or what kind of state of reality could correspond to
> different subjective experimental results, is such as to suggest
> that I am just deeply confused about the whole issue.
If you just look at Eliezer-runtime, and don't try to rationalize
anticipation and subjective probability, it seems to me that we
know all the facts in any given scenario, and cannot really be
said to be confused about anything.
Lee
> It is a very important lesson in life to never stake your existence,
> let alone anyone else's, on any issue which deeply confuses you - *no
> matter how logical* your arguments seem. This has tripped me up in
> the past, and I sometimes wonder whether nothing short of dreadful
> personal experience is capable of conveying this lesson. That which
> confuses you is a null area; you can't do anything with it by
> philosophical arguments until you stop being confused. Period.
> Confusion yields only confusion. It may be important to argue
> philosophically in order to progress toward resolving the confusion,
> but until everything clicks into place, in real life you're just screwed.