<DIV>Hi Lee,</DIV> <DIV> </DIV> <DIV>Lee writes:</DIV> <DIV> </DIV> <DIV>"First, it would be irrational (or at least not sensible) because<BR>where would you draw the line? Suppose I show you a little cube<BR>a decimeter on a side, and then I tell you that I've improved<BR>the figures above: I am now creating and destroying a sentient<BR>every nanosecond, and so am killing about a billion people per<BR>second. Is this really something that you---as you watch me <BR>calmly hold my little cube in my right hand---should really<BR>get horribly excited by?"</DIV> <DIV> </DIV> <DIV>Yes. </DIV> <DIV>And if I did not, it would indicate that somehow I had lost the last shred of my personal sense of morality.</DIV> <DIV> </DIV> <DIV>Lee writes:</DIV> <DIV> </DIV> <DIV>"The answer is that remember I am *creating* those people, giving<BR>them an entire luxurious nanosecond in which to enjoy their<BR>lives, their dreams, and hopes for the purpose
(before I destroy<BR>them). Shouldn't that go on the "good" side of the ledger?"</DIV> <DIV> </DIV> <DIV>No.</DIV> <DIV>Since we are already so deep into the hypothetical: If I "created" a Billion flesh and blood human infants and then dropped an H-bomb directly on top of them, such that they felt no pain at all before death, would that go on the "good" side of the ledger?</DIV> <DIV> </DIV> <DIV>Lee writes:</DIV> <DIV>"Well, it would still be small potatoes compared to the Trillions<BR>and Trillions of simulated humans that I would be running, or<BR>that the other 999,999 would be running. If as much good is <BR>being done by 999,999 out of every million people as harm is<BR>being done by 1, then, again, keep it in perspective."</DIV> <DIV> </DIV> <DIV>I still believe that ancestor simulations are themselves immoral. Now granted, not on the same level of immorality as a "Hell" program, but still immoral. At the very least, an ancestor simulation is
an extreme infringement on the freedom of the subjects. It's also denying its inhabitants a higher quality of life that "real" people alone are supposedly entitled to. And don't even get me started on "invasion of privacy" - how could *any* invasion of privacy top that which would occur with a simulation? Let's not mince words, an ancestor simulation is slavery, plain and simple - I thought we had moved beyond that.</DIV> <DIV> </DIV> <DIV>Lee writes:</DIV> <DIV>"Yes, but as Joseph Stalin said, "the death of a single Russian<BR>soldier is a tragedy. But the deaths of millions are a statistic." "</DIV> <DIV> </DIV> <DIV>Now you're comparing me to Stalin???</DIV> <DIV> </DIV> <DIV>C'mon Lee, isn't that a bit extreme? Is my position *really* that unreasonable?</DIV> <DIV> </DIV> <DIV>Best Wishes,</DIV> <DIV> </DIV> <DIV>Jeffrey Herrlich<BR><BR></DIV> <DIV><B><I>Lee Corbin <lcorbin@tsoft.com></I></B>
wrote:</DIV> <BLOCKQUOTE class=replbq style="PADDING-LEFT: 5px; MARGIN-LEFT: 5px; BORDER-LEFT: #1010ff 2px solid">Jeffrey H. asks<BR><BR>> Lee writes: "It seems that if I were run a software program<BR>> that created and destroyed a sentient every microsecond, then<BR>> after about a minute you would consider me the greatest mass-<BR>> murderer of all time. Is that true?"<BR>> <BR>> If by "sentient" you mean a "conscious" and vaguely humanoid<BR>> type being, then it would really pain me to see you or anyone<BR>> else do this. If you did do it, then what choice do I have but<BR>> to indeed consider you as "the greatest mass-murderer of all<BR>> time"? Why would this be an irrational conclusion?<BR><BR>First, it would be irrational (or at least not sensible) because<BR>where would you draw the line? Suppose I show you a little cube<BR>a decimeter on a side, and then I tell you that I've improved<BR>the figures above: I am now creating and
destroying a sentient<BR>every nanosecond, and so am killing about a billion people per<BR>second. Is this really something that you---as you watch me <BR>calmly hold my little cube in my right hand---should really<BR>get horribly excited by?<BR><BR>The answer is that remember I am *creating* those people, giving<BR>them an entire luxurious nanosecond in which to enjoy their<BR>lives, their dreams, and hopes for the purpose (before I destroy<BR>them). Shouldn't that go on the "good" side of the ledger?<BR><BR>Really, it's all very silly. Clearly no one is actually having<BR>any harm come to them. So what if a person briefly passes into<BR>and out of existence in a nanosecond? Instead of worrying about<BR>the fantastic numbers of "deaths", worry instead about happiness<BR>and suffering.<BR><BR>(I do agree with you that if I showed you a little cube where<BR>I created billions of sentients and were causing them nearly<BR>infinite agony, then you might very well wish to knock
the <BR>cube from my hand and stomp on it. But nothing very bad (or<BR>good) is happening under the case being described. So that's<BR>how you avoid considering me to be the greatest mass-murderer<BR>of all time.)<BR><BR>> Lee writes:<BR>> > "You should maybe think of what's bugging you this way: what<BR>> > are the odds that if you grant someone freedom they'll<BR>> > immediately conjure up a hell and conjure up millions of<BR>> > sophonts to agonize in it?"<BR><BR>> The odds? Don't know but I'll take a (conservative) wild guess:<BR>> maybe one in a Million. But, what is likely to be the world<BR>> population at the time of Singularity? 7 - 12 Billion? So,<BR>> maybe 7000 to 12000 people who would jump at this opportunity<BR>> if it was offered. Consider that in the distant future, a<BR>> *single* "bad" person could probably run a "Hell" program on<BR>> Trillions and Trillions of simulated humans. At how many<BR>>
multiples of Earth's population today would these total<BR>> murders constitute an atrocity? <BR><BR>Well, it would still be small potatoes compared to the Trillions<BR>and Trillions of simulated humans that I would be running, or<BR>that the other 999,999 would be running. If as much good is <BR>being done by 999,999 out of every million people as harm is<BR>being done by 1, then, again, keep it in perspective.<BR><BR>> My answer: It would become an atrocity with the first murder.<BR><BR>Yes, but as Joseph Stalin said, "the death of a single Russian<BR>soldier is a tragedy. But the deaths of millions are a statistic."<BR><BR>Lee<BR><BR>_______________________________________________<BR>extropy-chat mailing list<BR>extropy-chat@lists.extropy.org<BR>http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat<BR></BLOCKQUOTE>