[extropy-chat] Are ancestor simulations immoral?

Lee Corbin lcorbin at tsoft.com
Sat May 27 17:35:07 UTC 2006


Jeffrey H. asks

> Lee writes: "It seems that if I were to run a software program
> that created and destroyed a sentient every microsecond, then
> after about a minute you would consider me the greatest mass-
> murderer of all time. Is that true?"
> 
> If by "sentient" you mean a "conscious" and vaguely humanoid
> type being, then it would really pain me to see you or anyone
> else do this. If you did do it, then what choice do I have but
> to indeed consider you as "the greatest mass-murderer of all
> time"? Why would this be an irrational conclusion?

First, it would be irrational (or at least not sensible) because
where would you draw the line?  Suppose I show you a little cube
a decimeter on a side, and then I tell you that I've improved
the figures above:  I am now creating and destroying a sentient
every nanosecond, and so am killing about a billion people per
second.  Is this really something that you---as you watch me
calmly hold my little cube in my right hand---should get
horribly excited about?
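(Just to make the scale explicit, here is a back-of-the-envelope
sketch in Python; it is purely illustrative of the rates in the
thought experiment, not any real program:

    # rough scale of the thought experiment above
    SECONDS_PER_MINUTE = 60
    per_second_at_microsecond_rate = 1_000_000      # one sentient per microsecond
    per_second_at_nanosecond_rate  = 1_000_000_000  # one sentient per nanosecond

    print(per_second_at_microsecond_rate * SECONDS_PER_MINUTE)  # ~60 million in a minute
    print(per_second_at_nanosecond_rate)                         # ~1 billion every second

So the "improved" cube runs through about a thousand times as many
sentients per unit time as the original.)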

The answer is to remember that I am *creating* those people, giving
them an entire luxurious nanosecond in which to enjoy their
lives, their dreams, and their hopes for the future (before I
destroy them).  Shouldn't that go on the "good" side of the ledger?

Really, it's all very silly. Clearly no one is actually having
any harm come to them. So what if a person briefly passes into
and out of existence in a nanosecond?  Instead of worrying about
the fantastic numbers of "deaths", worry instead about happiness
and suffering.

(I do agree with you that if I showed you a little cube in which
I created billions of sentients and was causing them nearly
infinite agony, then you might very well wish to knock the
cube from my hand and stomp on it. But nothing very bad (or
good) is happening in the case being described. So that's
how you avoid considering me to be the greatest mass-murderer
of all time.)

> Lee writes:
> > "You should maybe think of what's bugging you this way: what
> > are the odds that if you grant someone freedom they'll
> > immediately conjure up a hell and conjure up millions of
> > sophonts to agonize in it?"

> The odds? Don't know but I'll take a (conservative) wild guess:
> maybe one in a Million. But, what is likely to be the world
> population at the time of Singularity? 7 - 12 Billion? So,
> maybe 7000 to 12000 people who would jump at this opportunity
> if it was offered. Consider that in the distant future, a
> *single* "bad" person could probably run a "Hell" program on
> Trillions and Trillions of simulated humans. At how many
> multiples of Earth's population today would these total
> murders constitute an atrocity? 

Well, it would still be small potatoes compared to the Trillions
and Trillions of simulated humans that I would be running, or
that the other 999,999 would be running. If as much good is 
being done by 999,999 out of every million people as harm is
being done by 1, then, again, keep it in perspective.
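(A toy tally, in Python, assuming your one-in-a-million guess and a
population of ten billion; the figures are only to keep the
proportions straight, not predictions:

    # toy ledger: one "bad" simulation-runner per million people
    population   = 10_000_000_000        # roughly within your 7-12 Billion range
    bad_fraction = 1 / 1_000_000
    bad_runners  = int(population * bad_fraction)
    good_runners = population - bad_runners

    print(bad_runners)                   # ~10,000 potential hell-runners
    print(good_runners // bad_runners)   # ~a million benign runners per bad one

)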

> My answer: It would become an atrocity with the first murder.

Yes, but as Joseph Stalin said, "the death of a single Russian
soldier is a tragedy. But the deaths of millions are a statistic."

Lee
