[ExI] Next moment, everything around you will probably change

Jef Allbright jef at jefallbright.net
Thu Jun 21 17:35:06 UTC 2007

On 6/21/07, Lee Corbin <lcorbin at rawbw.com> wrote:

> so similarity is a pretty good criterion for t = 0.0001 seconds post
> duplication. Well, my point is clear now.  I just don't see any problem
> coming up with the similarity criterion for any t, just so long as there
> is a *high* degree of similarity.  In short, two entities are the same
> person if they are similar enough on the microscopic level.
> > The point here, and we've been around this loop before, is that you
> > say physical/functional similarity tends to diminish with time and
> > that beyond some point one is no longer the same person, but your
> > theory of similarity-based personal identity doesn't say anything
> > about how much similarity or how it diminishes. Your theory is
> > incomplete, it only accounts for a special case.
> Yes, good.  I agree.  It's a very rough idea.  But maybe it's not the
> fault of the "similarity theory" because we could make the same
> criticism of any kind of lumping into categories that mammals do.
> The universe itself can perhaps be blamed, for so easily giving
> rise to categories.

You've been polishing your thinking about personal identity based on
physical/functional similarity for decades and it allows you to see
that duplicates can be **exactly** the same person.  That idea isn't
"rough", nor is there any "fault" or "blame"; it's just incomplete.

My point involves the understanding that categories don't "exist"
ontologically; they are only ever the result of processing in the
minds of the observer(s).

> >> I go with similarity on many, many other
> >> things.  Leibniz even elevated to a principle "Identity of Indiscernibles"
> >> in a somewhat related context.  Hot days are like other hot days,
> >> dependent on similarity of structure (along one dimension). Two
> >> rabbits are considered to be of the same species not because their
> >> DNA is exactly equivalent, but because of close similarity.  In such
> >> ways we categorize almost *everything*, so similarity is pretty
> >> universal and powerful (judging by the success of Darwinian creatures
> >> who employ it, e.g., a gazelle that lumps all lions into a single deadly
> >> category).
> >
> > Similarity is not the problem.  The point is that with regard to
> > personal identity, similarity in terms of agency is more coherent and
> > extensible than similarity in physical/functional terms.
> > ...
> > I already gave you the example of the two near-identical duplicates in conflict.
> But as I said, the similarity metric says that they're the same person,
> and a wife of one wouldn't tell the difference between the two, and
> so on. In other words, they seem in all ways to be the same person.
> They just hate each other is all.  (And that's hardly novel:  we often
> wonder if a given individual "hates himself" in some way.)

You appear here to neglect your own belief and arguments that physical
substrate doesn't matter at all (I completely agree, of course), and
that what matters is function, defined of course by the agent's
physical structure (and by something else of which you appear
consistently unaware).  How correct can it be to refer to separate
instances as being the same person on the basis of their almost exact
physical similarity, when their functioning recognizes the separate
existence of the other, so as to hate, compete with or even destroy
the other?  (Or even to cooperate.)

There's a very practical point to this philosophizing (and it's not
about personal survival).  I've stated it twice now, even highlighted
it with "---------------", and twice you've deleted it without comment:
>>... any degree of selfishness will tend to put
>> duplicates at odds with one another as they interact from within
>> increasingly disparate contexts.

If we can get past the polemics we can consider the more interesting
(in a practical sense) issues of systems of competition and
cooperation, which is necessarily between **agents**.


> Okay, since you have been so kind to cut and paste it again, I
> will try to answer it as directly as I can.  I *don't* see those
> as two separate individuals at all.  Neither would the people
> who know them.

People have no difficulty with biological twins being different
persons, despite very high physical/functional similarity.  You would
probably like to say this is because each twin has different memories,
etc., but consider that other people can't see memories; what they see
is separate agency.

>  I admit that you have one good point here:
> namely, that as they fought, they'd see themselves as separate
> people.


> But I say that they are simply mistaken:  it's as though
> each has been programmed by nature to regard anything
> outside its own skin as "the other" or as "alien".  I mean, we
> could have two *totally* identical instances of the Tit-For-Tat
> program playing each other (or rather a minor variation of
> Tit-For-Tat that tried a random defection now and then),
> and they naturally behave as though they are going up against
> "the other", "the alien", the "other player".  Yet they are truly
> identical, right down to the last statement of code.

You're arguing circularly again, assuming your own conclusion.
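Lee's scenario is easy to make concrete.  The following is only a
minimal sketch, not anyone's actual code: the 5% defection rate, the
round count, and the function names are all assumptions made for the
illustration.

```python
import random

def noisy_tit_for_tat(opponent_last, p_defect=0.05, rng=None):
    """Cooperate first, then mirror the opponent's last move,
    defecting at random with small probability p_defect."""
    rng = rng or random
    if rng.random() < p_defect:
        return "D"
    return "C" if opponent_last is None else opponent_last

def play(rounds=20, seed=42):
    rng = random.Random(seed)
    a_last = b_last = None
    history = []
    for _ in range(rounds):
        # Both instances run the *same* code; each simply receives
        # "the other player's last move" as its input.
        a = noisy_tit_for_tat(b_last, rng=rng)
        b = noisy_tit_for_tat(a_last, rng=rng)
        history.append((a, b))
        a_last, b_last = a, b
    return history

history = play()
```

Both players are byte-identical, yet once either one's random
defection fires, the mirroring dynamic produces retaliation: the
adversarial framing is built into the interaction, not into any
difference in the program text.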

This topic gets more interesting when we get past this (temporary?)
impasse and consider the application of artificial agents of arbitrary
physical/functional similarity fully dedicated to acting on behalf of
a single entity -- variously enabled/limited instances of exactly the
same self.

> > The point here is to show that despite extreme similarity, a pair of
> > duplicates can easily fall into conflict with each other. This
> > conflict can be over property, relationships, legal responsibility; in
> > essence these are conflicts over rightful identity -- a paradox if, as
> > you claim, they are necessarily the same identity due to their
> > physical/functional similarity.
> >
> > Or maybe simpler for you, consider the two duplicates, each with
> > identical intent to prevent the existence of the other. If, as you
> > say, physical/functional similarity determines personal identity, then
> > do you see the paradox entailed in a person trying to destroy himself
> > so he can enjoy being himself?
> I admit that there is irony in the situation of a person or program trying
> to destroy instances that are identical to itself,


Paradoxical if you insist that they **must be** the same person.

Ironic if you see that personal identity consists in the mind of the
observer(s), but expect them to act as one.

Natural if you see them as separate agents, and that any degree of
selfishness will tend to put duplicates at odds with one another as
they interact from within increasingly disparate contexts.

> even though it has been
> programmed to safeguard "its own existence".  But I consider the
> programs or persons acting in such a fashion to simply be deeply mistaken.
> All *outside* observers who are much less biased see them as
> identical.  Why aren't they identical?  Why should we view them as
> separate *people* or separate *programs* just because they're at
> each other's throats?

Why should outside observers see two competing twins, no matter how
physically/functionally similar, as the same person?  Should they
treat the offensive software-engineer Lee exactly the same as the
defensive chess-playing Lee?  We're talking about separate agencies
here, and **for all** practical purposes, and **for all** observers
(including these agents themselves), they are separate persons.

Even to get them to cooperate, which should be the desired outcome,
they must necessarily see themselves first as independent agents, and
then as the same person only to the extent that they are seen as
representing a single (abstract) entity known as Lee.

> > Or back to the biological organism manifesting Dissociative Identity
> > Disorder.  I said this supported my point, and you said "thanks"
> > without further comment.  In such a case we can agree that the
> > physical/functional similarity is total since it's only a single
> > organism, but we also agree that any observer (including the
> > observers manifested by that particular organism) will see different
> > persons to the extent that they are perceived to act on behalf of
> > different entities.
> Hmm, well, we seem to have a hard disagreement here. Yes, let's
> consider just the case we/I have been discussing:  indeed there
> are many people who would hate their duplicates.

Are you evading here the case of the biological organism manifesting
DID, or are you conflating it with the case of the duplicates?

> So let's suppose
> that A and A' are identical, and so---just as you say---they are
> what you call "different persons" because they are perceived as
> acting on behalf of different entities. Clearly here, they are acting
> on behalf of different *instances* of what was a single person.
> You and I each beg the question in a different way.  You beg the
> question by saying that they are clearly different entities, and so
> are different people, and I say that (because of similarity) they
> are clearly the same person (or program).  How may we resolve
> this?
> Well, as above, I suggest that we consult outside authorities of
> higher reputation.

What?  Appeal to authority -- on the Extropy list?!

> If we send them into different rooms, can
> someone who knows them well tell them apart?  (I say no.)

Certainly they can be distinguished by someone who knows them well.
One of them goes on and on about how "this shouldn't be happening,
it's all just a deep mistake, how can I be so confused as to attack
myself like this, I just wanted to keep my software job and also get
to play more chess, there's no reason to be upset, I know beyond
logical doubt that my other instance should actually be anticipating
our increased pleasure, I know I'm a very reasonable person."  The
other keeps saying "of course this was bound to happen, I see it
clearly now even though I denied it when Jef tried to explain, I'm
depressed and burnt out with writing software and if things don't
change I'm going to do something...drastic."

> What if we administer the best personality tests that have been
> so far devised?  Will they show a difference? (Clearly no.)

Are you saying that personality traits have some direct bearing on
personal identity?  We know that identical twins separated at birth
show very highly correlated personality traits; do you mean to say
that, to that extent, they should be considered practically the same
person?  Or are you saying that some extremely high correlation would
indicate shared personal identity?  Please describe how high it would
have to be, without any circular reference to your conclusion.

> So
> isn't it up to you to say *why* they are different people?

I have been!  I've tried to show you in terms of social/moral/legal
interaction, in terms of extensibility to future cases of agency as
self, in terms of the paradox of one being in (even deadly) conflict
with oneself, even in terms of parsimonious elimination of unnecessary
ontological entities.

> How
> can you avoid my insistence that in order for them to be different
> people they must be different in some *way*?  (I.e., sneaking in
> a similarity criterion.)

We've already gone around this loop in this very thread.  With regard
to personal identity, physical/functional similarity is only a special
case.  Personal identity is about agency.  Different agency entails
different personal identity.  (You didn't appear sneaky there, but
perhaps a bit obtuse.)

> > Personal identity is about agency.  Physical/functional similarity is
> > only a special case.
> I still don't agree.  Are there other examples that can be offered?
> How about some other examples where *agency* is clearly key?
> Perhaps in daily life?

I did that earlier, showing how over time a person (Aging Alice)
changes and even spawns variants while maintaining the same agency.
You chose to interpret it as a person continually or repeatedly...


Oh well.

Also, I'm still hoping for a thoughtful response from you with regard
to the case of the biological organism manifesting DID.

- Jef
