[extropy-chat] Anticipation and Benefit (was Re: The Anticipation Dilemma)

Lee Corbin lcorbin at rawbw.com
Mon Apr 16 03:40:21 UTC 2007

Stathis writes

> I could say I am more concerned about my current self, due to
> the anticipation issue.

I take you to mean that you sanction choices which those of us
who view copies as selves regard as illogical.  That is, you intend
to make decisions based upon your (inconsistent) anticipation of
how "you" will feel or be later, rather than making decisions
that some of us would argue are for your greater total benefit.

> In the long term if duplication becomes commonplace those
> people who count copies as selves will prevail, but this sort
> of argument doesn't necessarily determine what we should do...

Right.  I can accede to John Clark's point that entities which
think memories of a particular human are nothing special will
prevail, and Damien could accede to teleporters having an
evolutionary edge in the future.  Yet this doesn't change our
perception that the suggested changes would actually kill us.

> I consider it "irrational" that I am concerned about the welfare
> of some guy tomorrow who thinks he is me, has my
> memories and my possessions.

Yes; this is our eternal divide on this list, one that suggests to
some that we never make progress  :-)

> If he travelled to today in a time machine and cleaned out
> my/his bank account, I'd be upset.

I might be too, unless I granted him the benefit of the doubt
that something very strange is afoot.  Otherwise, I know that
my duplicate would take the benefit of *this* instance into
account too, unless, as I say, extraordinary circumstances
were somehow prevailing.

> I say that none of the other copies are really me, because
> if we were all in a sinking ship together each of us would
> fight for the last place on the lifeboat.

In the lifeboat, my instances (e.g. "I") would all be thinking
about how to maximize the number of survivors, yet when
the waters began to close over our heads, our animal instincts
would take over.  I don't identify with those particular instincts,
and would delete them if ever given the opportunity.

So, just to be clear and for the record: if an instance of you
were given the choice between saving itself xor saving an
extremely recent duplicate who would then be able to deposit
$10M to the S. P. account, that instance of you would choose
for itself to survive?
