[extropy-chat] Are ancestor simulations immoral?
A B
austriaaugust at yahoo.com
Tue May 30 13:57:39 UTC 2006
Hi Lee,
Lee writes:
"First, it would be irrational (or at least not sensible) because
where would you draw the line? Suppose I show you a little cube
a decimeter on a side, and then I tell you that I've improved
the figures above: I am now creating and destroying a sentient
every nanosecond, and so am killing about a billion people per
second. Is this really something that you---as you watch me
calmly hold my little cube in my right hand---should really
get horribly excited by?"
Yes.
And if I did not, it would indicate that somehow I had lost the last shred of my personal sense of morality.
Lee writes:
"The answer is to remember that I am *creating* those people, giving
them an entire luxurious nanosecond in which to enjoy their
lives, their dreams, and their hopes for the future (before I
destroy them). Shouldn't that go on the "good" side of the ledger?"
No.
Since we are already so deep into the hypothetical: If I "created" a Billion flesh and blood human infants and then dropped an H-bomb directly on top of them, such that they felt no pain at all before death, would that go on the "good" side of the ledger?
Lee writes:
"Well, it would still be small potatoes compared to the Trillions
and Trillions of simulated humans that I would be running, or
that the other 999,999 would be running. If as much good is
being done by 999,999 out of every million people as harm is
being done by 1, then, again, keep it in perspective."
I still believe that ancestor simulations are themselves immoral. Now granted, not on the same level of immorality as a "Hell" program, but still immoral. At the very least, an ancestor simulation is an extreme infringement on the freedom of its subjects. It also denies its inhabitants the higher quality of life to which "real" people alone are supposedly entitled. And don't even get me started on "invasion of privacy" - how could *any* invasion of privacy top that which would occur within a simulation? Let's not mince words: an ancestor simulation is slavery, plain and simple - I thought we had moved beyond that.
Lee writes:
"Yes, but as Joseph Stalin said, "the death of a single Russian
soldier is a tragedy. But the deaths of millions are a statistic." "
Now you're comparing me to Stalin???
C'mon Lee, isn't that a bit extreme? Is my position *really* that unreasonable?
Best Wishes,
Jeffrey Herrlich
Lee Corbin <lcorbin at tsoft.com> wrote:
Jeffrey H. asks
> Lee writes: "It seems that if I were to run a software program
> that created and destroyed a sentient every microsecond, then
> after about a minute you would consider me the greatest mass-
> murderer of all time. Is that true?"
>
> If by "sentient" you mean a "conscious" and vaguely humanoid
> type being, then it would really pain me to see you or anyone
> else do this. If you did do it, then what choice do I have but
> to indeed consider you as "the greatest mass-murderer of all
> time"? Why would this be an irrational conclusion?
First, it would be irrational (or at least not sensible) because
where would you draw the line? Suppose I show you a little cube
a decimeter on a side, and then I tell you that I've improved
the figures above: I am now creating and destroying a sentient
every nanosecond, and so am killing about a billion people per
second. Is this really something that you---as you watch me
calmly hold my little cube in my right hand---should really
get horribly excited by?
The answer is to remember that I am *creating* those people, giving
them an entire luxurious nanosecond in which to enjoy their
lives, their dreams, and their hopes for the future (before I
destroy them). Shouldn't that go on the "good" side of the ledger?
Really, it's all very silly. Clearly no one is actually having
any harm come to them. So what if a person briefly passes into
and out of existence in a nanosecond? Instead of worrying about
the fantastic numbers of "deaths", worry instead about happiness
and suffering.
(I do agree with you that if I showed you a little cube where
I created billions of sentients and were causing them nearly
infinite agony, then you might very well wish to knock the
cube from my hand and stomp on it. But nothing very bad (or
good) is happening under the case being described. So that's
how you avoid considering me to be the greatest mass-murderer
of all time.)
> Lee writes:
> > "You should maybe think of what's bugging you this way: what
> > are the odds that if you grant someone freedom they'll
> > immediately conjure up a hell and conjure up millions of
> > sophonts to agonize in it?"
> The odds? Don't know but I'll take a (conservative) wild guess:
> maybe one in a Million. But, what is likely to be the world
> population at the time of Singularity? 7 - 12 Billion? So,
> maybe 7000 to 12000 people who would jump at this opportunity
> if it was offered. Consider that in the distant future, a
> *single* "bad" person could probably run a "Hell" program on
> Trillions and Trillions of simulated humans. At how many
> multiples of Earth's population today would these total
> murders constitute an atrocity?
Well, it would still be small potatoes compared to the Trillions
and Trillions of simulated humans that I would be running, or
that the other 999,999 would be running. If as much good is
being done by 999,999 out of every million people as harm is
being done by 1, then, again, keep it in perspective.
> My answer: It would become an atrocity with the first murder.
Yes, but as Joseph Stalin said, "the death of a single Russian
soldier is a tragedy. But the deaths of millions are a statistic."
Lee
_______________________________________________
extropy-chat mailing list
extropy-chat at lists.extropy.org
http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat