[ExI] People are Genuine Altruists, Sociopaths, or Confused/Moody
lcorbin at rawbw.com
Tue Sep 16 04:02:28 UTC 2008
Isabelle writes (welcome, Isabelle!)
> I've thought about it, and I disagree for another reason... I think
> that if you found out that in fact, you were the only living person
> and then you DIDN'T change your behavior that THAT would mean you are
> altruistic. You would /definitively/ know you would not receive
> something back, because they are not real, and yet you would still
> act with kindness.
Perhaps we are miscommunicating about what it means "to
get something back". I refer to the usual conventions in daily
life where if you are nice to people, they are nice back. If
you are nasty to most people, they'll either avoid you or
be nasty back. Therefore, in a simulation that is *perfect*,
i.e., one in which the other people are controlled by the
puppet-master to behave just as they would in real life, you had better
be nice to people (for your own good) or the puppet-master
will naturally simulate retaliation on their part. Does this make
what I am saying any clearer?
> That would mean your altruistic tendencies are
> genuine and natural. Lots of people are not genuinely altruistic.
> They may be acting in an altruistic manner simply because
> that is how they want to be perceived by others, or because
> they believe in karma, or because it makes them FEEL GOOD,
> all of which are benefits to self.
But would it really make you feel better to be altruistic
towards people *in a simulation*? Here I am
addressing cases of genuine altruism, which is the
term I use to exclude all those payback cases,
namely (a), (b), and (c) from the earlier emails.
Take the case of being nice to someone in traffic (who by
hypothesis cannot possibly pay you back because
of the size of the metropolitan district, or you are
a foreigner, or something). Why ever be nice in a way
that is not immediately self-rewarding? In real life,
one may do so because it is in one's nature, or one
has a conscience, or (as has been argued here by
others) one is principled, one wishes to be fair, etc.
That is, in a simulation, I myself would cure my habit
of sometimes letting cars go ahead of me (in cases
where there is no danger to myself or anything).
Since I am the only consciousness in the simulation,
it makes no sense for me to be nice. So I would *change*.
Shouldn't you too? I would advise it. For in those cases
being nice benefits no one, and actually hurts someone:
yourself.
> All of which technically would not be altruistic.
> But to be kind to virtual people simply because
> it is in your nature to be kind, even if they are
> not real, and cannot appreciate it, nor benefit from it...
> well, that sounds more like a test of altruism to me.
For initial reactions, I agree. At first, I would continue
to be nice. But then I'd start thinking about it and
start saying to myself "Hey, you're only causing
yourself delay or inconvenience, and *no one*
is benefiting, so stop it, Lee."
Your point "because it is in your nature to be kind"
is very important, I think. I believe genuine altruism
to be built in at the genetic level, whereas reciprocal
altruism and kin selection, however they arose,
are altruism of a lower order, and do not qualify
as "genuine altruism". Does that make sense?
> PS, I did not seem to get the first in this line of conversation...
> can someone forward it to me, or send me a link? Thanks -Isabelle
This thread began on August 30:
> [Lee wrote]
>> (2) if it was revealed to you that you were living in a simulation
>> wherein you were the only conscious person, and everyone
>> else merely a puppet under the manipulation of a cold,
>> distant, infinitely calculating entity who had no emotions
>> whatsoever... would your behavior towards others change
>> at all?
>> If you can answer yes to either (1) or (2), you possess genuine
[Damien B. wrote]
> This has got to be wrong, and suggests a flaw in your definition of
> altruism (which I think must embody a benefit to some other person
> with interests) [Yes].
Well, I agree with that, and with any implications of it that I can
think of. [I was agreeing only with the characterization of "altruism"]
> If you treat your toaster well, are you more altruistic than someone
> who never cleans it or even smashes it on the bench when it burns
> the toast?
Certainly not. I am claiming that if you *did* find yourself living in the
kind of simulation described above, and your behavior *did* change
as a result, then you are a genuine altruist, since there are now in your
presence no feeling or conscious entities whatsoever whom your actions
can affect. Thus the behavior evinced by the genuine altruist, e.g.
leaving tips in restaurants he or she will never visit again, or letting
people go ahead of you in traffic, no longer serves any purpose.
For example, it took some time, but I finally managed to prove (at least
to myself) that I am a genuine altruist because I *do* let people out of
crowded parking lots ahead of me, and I *do* leave tips in restaurants
I know I'll never visit again, and I would immediately stop doing that
in a simulation where I was the only genuine person. I thought of the
scenario (which I call the "VR-Solipsist") as a means to determine
whether or not I was a genuine altruist.
Altruism based either upon kin selection or reciprocation, though
very real, very powerful, and very beneficial to our world, does not
count as what is here being called "genuine altruism".