[ExI] Affecting Past Experience (was The Reality of Categories)

Lee Corbin lcorbin at rawbw.com
Thu Jul 19 05:39:15 UTC 2007


Stathis wrote

(Sent: Tuesday, July 17, 2007 5:10 AM)


> In a block universe, there are my previous selves and my future
> selves, all doing their thing simultaneously (as it were). My past
> selves are about as similar to me as my future selves at an equivalent
> temporal displacement, but my future selves constitute a memory
> superset of my present self while my past selves do not. Therefore, I
> don't care if my past selves are wiped out by a giant
> trans-block-universe-thingy whereas I do care if my future selves
> suffer such a fate.

Of course, the whole idea of a block universe is that, being
deterministic, the whole thing exists "at once", and so it's hard
to make sense of past selves being wiped out by something
like a "trans-block-universe-thingy".  But I believe that sense
*can* indeed be made of such a scenario.  Perhaps this is what
you are referring to:

Let's say that an uploaded entity---born, raised, and educated
all in software (or I should say "coded-up, run, and having
incorporated a great number of facts")---finds himself in a
Newcomb's Paradox situation.  Then I think that there is a
strong sense in which he is "free" to choose whether he
actually had certain experiences, or merely had the memories
added artificially.

It's all deterministic, of course, but still his Newcomb choice
A (as opposed to choice B) may be correlated very highly
with whether he really did experience X or merely had the
memories of experiencing X added.  And by "highly
correlated", I mean that a much vaster entity who designed
and ran this entity arranged for the correlation to be 1.

(In other words, a vastly superior entity E+ designed entity
N so that N's choice---which is "free" if you ask N---will
be A if and only if N did actually experience X, and will be
B if not.  Now this isn't so easy, since N is supposed to
be the exact same entity whether or not his memories were
artificial.  But it could be done, so that the N who chooses A
(and so got the real experiences) differs only infinitesimally
from the N' who merely got the memory enhancement.  This
probably makes sense only if N is systematically confronted
with a whole sequence of A/B choices, and learns after each
choice whether he really did experience X or not.)
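
Just to make that repeated-trials setup concrete, here is a toy
sketch in Python (purely illustrative; the names design_entity
and choose are invented for the example) of how E+ might arrange
for the correlation between N's choice and the truth about X to
come out as exactly 1 across a run of trials:

import random

# Illustrative sketch only: for each trial, E+ decides whether N
# really experienced X or merely received the memories of X, and
# constructs N's (deterministic) decision procedure so that N's
# choice tracks that fact.

def design_entity(really_experienced_x):
    """E+ builds N so that N's 'free' choice is A iff X was real.
    From N's point of view the choice feels free; from E+'s point
    of view it was arranged in advance."""
    def choose():
        # N deliberates and picks A or B; by construction the
        # outcome is perfectly correlated with the truth about X.
        return "A" if really_experienced_x else "B"
    return choose

trials = []
for i in range(10):
    truth = random.choice([True, False])  # did N really experience X?
    n_chooses = design_entity(truth)      # E+ designs this instance of N
    choice = n_chooses()                  # N makes his "free" choice
    # Afterwards N learns the truth, as in the repeated-choice setup.
    trials.append((choice, truth))

# Perfect correlation: choice A occurs exactly when X was real.
assert all((c == "A") == t for c, t in trials)
print(trials)

Of course the real trick, glossed over in the sketch, is building
N so that the A-chooser and the B-chooser differ only
infinitesimally from one another.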

It is easier for me to assume that X is a positive experience
that N is eager to repeat, and which N hopes was a real
experience and not just a memory addition.  It's very much
as in Total Recall, where, after the Martian adventure is
over, Arnold will always wonder whether it really happened
or the folks at Total Recall just added in the memories.  We
may assume for convenience that Arnold would treasure the
experience's having been real (even if somehow it had no
further repercussions in his life).

Lee



