[extropy-chat] Are ancestor simulations immoral? (An Attempted Survey)

A B austriaaugust at yahoo.com
Fri May 26 19:36:34 UTC 2006


Hi Lee,
   
  Lee writes:
  "Well, Jeffrey, I have to add that I had the same impression that
Samantha did. In my words, you did seem to be arguing against a
position that I don't think anyone on this list takes."
   
  Okay, well perhaps I phrased it poorly. I was making a somewhat exaggerated example in order to show that the principle behind it is at least ethically questionable.
   
  Lee:
  "Surely you don't believe that everything that is bad
should be outlawed. Or do you?"
   
  No, absolutely not! I realize that "bad" doesn't have an objective definition. Lots of people think that pre-marital sex is extremely "bad"! I think it's perfectly fine, for example. Like I said before, I think that anyone should be able to do *anything* with their own bodies, minds, and non-sentient property - and when I say *anything*, I mean *ANYTHING*  :-)  The only line I draw is at murdering or torturing (or intentionally bringing harm to) other *conscious* beings.
   
  Lee, I'm a little bit confused by your reference to "Rule of Law". Could you elaborate for me on exactly what you are referring to? I can't really determine whether you mean that standard Laws are "good" or "bad", so I can't yet really comment on this section of your post.
   
  Lee writes:
  "It seems that if I were get a software program that created
and destroyed a sentient every microsecond, then after about
a minute you would consider me the greatest mass-murderer of
all time. Is that true?"
   
  Lee, of course I barely know you, but you seem like a reasonable, patient, and "good" person. If by "sentient" you mean a "conscious" and vaguely humanoid type of being, then it would really pain me to see you or anyone else do this. If you did do it, then what choice would I have but to consider you "the greatest mass-murderer of all time"? Why would that be an irrational conclusion?
   
  Lee writes:
  "You should maybe think of what's bugging you this way: what
are the odds that if you grant someone freedom they'll
immediately conjure up a hell and conjure up millions of
sophonts to agonize in it?"
   
  The odds? I don't know, but I'll take a (conservative) wild guess: maybe one in a million. And what is likely to be the world population at the time of the Singularity? 7 - 12 billion? So, maybe 7,000 to 12,000 people who would jump at this opportunity if it were offered. Consider that in the distant future, a *single* "bad" person could probably run a "Hell" program on trillions and trillions of simulated humans. At how many multiples of Earth's present population would these murders amount to an atrocity?
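  For what it's worth, that back-of-envelope estimate works out like this (both the one-in-a-million rate and the population range are guesses on my part, not data):

```python
def bad_actors(population, rate=1e-6):
    # Guessed fraction of people who would run a "Hell" program,
    # times a guessed world population at the time of the Singularity.
    return int(population * rate)

print(bad_actors(7e9))   # low-end population guess:  7000
print(bad_actors(12e9))  # high-end population guess: 12000
```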
   
  My answer: It would become an atrocity with the first murder.
   
  Best Wishes,
   
  Jeffrey Herrlich 
  
Lee Corbin <lcorbin at tsoft.com> wrote:
  Jeffrey replies to Samantha, but asks questions too delicious for 
me to pass up!

> Samantha:
> > "This is a straw man that was not advocated."

> I didn't mean to claim that it was advocated; I was making my case.

Well, Jeffrey, I have to add that I had the same impression that
Samantha did. In my words, you did seem to be arguing against a
position that I don't think anyone on this list takes.

> Let me ask you a question:

> I assume that we agree that a "real" being and a conscious
> "simulated" being are both composed of hardware and software
> and that both exist at the "real" layer of "reality". Why
> should ending the life of a "simulated" being, be viewed
> any differently than a "real" being murdering another "real"
> being?

Here is a case of the same thing, I think. I don't believe that
anyone at all on this list would say that there is a difference. 

> Why should torturing a conscious "simulated" being, be viewed
> differently than a "real" being torturing another "real" being?
> The crimes are equivalent. 

Likewise---if you mean "moral crime".

> The only "factor" that would supposedly separate the status
> of a "real" being from the status of a "simulated" being is
> that the "real" one was born first and therefore supposedly
> deserves to wield ultimate power over the one that was born
> later.

It all depends. I do believe that you are ignoring differences
that are essential to me, and perhaps to the others. I suspect
that the core of the actual difference between our viewpoints
can be seen in your next statement:

> That's "messed up"; it's legally allowing murder and torture.

Bringing "The Law" into it is an *entirely* different can of
worms. Surely you don't believe that everything that is bad
should be outlawed. Or do you?

Almost *all* of our progress and all of the humanitarian 
improvements in the human condition over the last ten thousand
years have stemmed from (1) Rule of Law, and (2) Protection
of Private Property. It is extremely hazardous, in my
opinion, not to treat these two principles with the utmost
respect.

As the 20th century showed, there is practically no limit
to the harm that results from tampering with these principles,
tampering that is always accompanied, of course, by the best
of intentions.

As Samantha said, it may be non-trivial to determine the
extent of consciousness of a character in a video game.
I suspect that the difficulty can be arbitrarily great;
i.e., it could be arbitrarily high up in the complexity
classes, beyond NP-complete.

So we come to the classic question: Who is to decide?

The totalitarian answer is that all power should be in the
hands of the people, i.e., in the hands of their elected
or non-elected representatives. In other words, the government
must decide which of your actions are ethical and moral, and
which are not.

But the evolved solution, namely (1) Rule of Law, and (2)
Protection of Private Property, is far less ambitious. It
recoils from the idea that wisdom can be concentrated in a
single place (e.g. the Supreme Court) or anywhere in fact,
that is not *local*.

Thomas Sowell explains all this with fantastic clarity in
his books, such as "Knowledge and Decisions". The greatest
thinkers of the past, e.g. von Mises and Hayek, were perhaps
the first to understand this deeply, but an "in practice" understanding
was also achieved by America's founders, who legislated tremendous
*restrictions* on what higher bodies could do to lower ones.

Therefore the instinctive recoil of people like Samantha, who
are extraordinarily leery of having some body intercede in their
affairs to determine the complexity of their software and whether
or not they're doing the "right" thing with it, is well-founded.

I do not happen to have such a strong aversion as many like
her do; but still, I can fully understand on *principle* that
it is best for someone to be able to determine what happens
inside their own minds and inside their own property, without
an outside regulator poking around.

It seems that if I were to get a software program that created
and destroyed a sentient every microsecond, then after about
a minute you would consider me the greatest mass-murderer of
all time. Is that true?

You should maybe think of what's bugging you this way: what
are the odds that if you grant someone freedom they'll
immediately conjure up a hell and conjure up millions of
sophonts to agonize in it?

Lee

_______________________________________________
extropy-chat mailing list
extropy-chat at lists.extropy.org
http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat


		
