[extropy-chat] Identity (was: Survival tangent)

Lee Corbin lcorbin at rawbw.com
Sun Nov 5 22:17:29 UTC 2006


Randall writes

> On Nov 5, 2006, at 1:44 PM, John K Clark wrote:
>> If I place you (the copy) and the original an
>> equal distance from the center of a symmetrical room so you see the
>> same things, and then instantly swap your body's position with the
>> original's, then neither you nor the original nor any outside observer
>> could detect the slightest change. There was no change because,
>> although there were 2 bodies in the room, there was only one person.
> 
> ...
> But here's a question for you: in a Tegmark universe, as I
> understand it, there are an infinite number of John K Clark
> bodies, widely separated by space, but in your view all with
> an equal claim to being *you*, right?  (In a MWI universe,
> the same is true but without the spatial separation).  So,
> when you walk across the street, why dodge a car that almost
> hits you?
> 
> Lee Corbin would say, I believe, that the important thing is
> to increase Lee-Corbin-runtime, and that this dictates saving
> this particular Lee-Corbin-process,

Right you are!  Proof that we *do* understand each other,
occasionally  :-)

> but I don't think this mild preference (for the actual runtime
> lost by losing this particular process would be infinitesimal)
> explains the great lengths that I imagine you guys would go
> to if this process' runtime were in danger.

Here is a scenario.  There are a million copies of me in the
inner solar system, all made from the original in the last
ten minutes and placed into identical or near-identical
physical circumstances.  But there is also an understanding
that one of them must be eaten by a Bengal tiger!

It is not efficient for each to worry about a 1 in 1,000,000
possibility, so our minds turn to other things. But in one
place, a Bengal tiger leaps into the room.  By sheer reflex,
that copy will try to escape and will no doubt be terrified.

But I say that these are only "lower-order" aspects of me,
and are not representative of who I truly am.  If the scenario
becomes less graphic, and one of them must press a button
and be disintegrated, then all of us would be indifferent as to
who did so.  If the button were in the room, we'd all reach
for it, with the understanding that the 999,999 slower ones
would not be disintegrated.  No instance would actually care
a whit.

> If you don't agree that the runtime lost would indeed be
> infinitesimal,

But it's not infinitesimal:  it's one whole unit of John Clark or
Lee Corbin.  

> we can introduce a random number generator based on
> decay rates or some other apparently random source,
> and then reason only about the reactions of the one in a
> zillion John K Clark or Lee Corbin.

Not sure I understand, but I'll give it a swing.  Okay, in
each of the rooms where I've been copied, we are each
liable to hear a gong go off announcing that our
1/1,000,000 chance from the random number generator
actually came up, and any instance who hears the gong
knows that he is about to be zapped.  We'd each be
unhappy that *so many* of us were unlucky, even if it
was only one.  But none of us (speaking for myself only,
not for John) would feel his life imperiled in the
slightest.

Lee

More information about the extropy-chat mailing list