[extropy-chat] Are ancestor simulations immoral? (An attempted survey)

Lee Corbin lcorbin at tsoft.com
Wed May 24 03:58:59 UTC 2006


Robert writes

> First, since it's a simulation, it's perfectly reasonable to do
> whatever one feels like.

Well, the question was whether or not it is *moral*. That is,
let's take it for granted that it is immoral to upload a kitten
(or to emulate a never previously existing one) and proceed to
hideously torture it. I say that that's immoral, and I say that
you shouldn't do it (i.e., I disapprove).

> One may be running simulations whose explicit purpose is to
> explore morality, in which case one needs to create situations
> where some people may experience pain, suffer, die, etc.  Either
> we are in the bottom level "reality"

Oh, what difference could it make what level it's on, so long
as we are talking about verisimilar emulations?

> or everyone in the upper levels of reality has thrown away the
> "morality" paradigm.

Sorry, you've lost me. Do you mean higher levels of embedded simulation?

> Furthermore, once we have the ability to run simulations
> ourselves (remember *worlds* of billions of RBs, LCs, JHs,
> etc. -- i.e. 10^10 copies' worth of each isn't even
> stretching things), there doesn't seem to be much one can
> do to prevent it.

If you're on some solar-system real estate that "belongs"
to a very competitive AI, then it may force you to be nice.
So it may lay down some stupid rule that says no matter how
pleasant 99x10^8 of the 10^10 copies of RB have it, the
remaining 10^8 copies may *not* have unpleasant experiences
of any kind. We know people who intend for their AI to do
exactly this.

> If I happen to [simulate] billions and billions of RBs
> (some of whom are bound to suffer horribly and die)
> then some holier-than-I people are going to be bound and
> determined to prevent me from doing that.  The only way
> that seems possible is to either (a) imprison me... or
> (b) take the nanobots to my mind and force the removal
> of such "immoral" thoughts ( i.e. state or equivalent
> "mind control").

I believe that they would prefer that you simply be frustrated
in your designs, much as people who would run cockfights today
are discouraged by contrary laws (and we are talking about
entities who could enforce such laws with much greater
efficiency than is possible today).

> To me it looks like the only way to "enforce" moralities
> of the form "thou shalt not make others (real or virtual)
> suffer" is to force the person who would think such thoughts
> (and presumably act upon them) to suffer instead. 

Yes, but they suppose, rightly I think, that the "suffering"
you endure because you can't freely simulate whom you please
is dwarfed by the unpleasantness experienced by some of those
you would simulate.

Lee

P.S. To those to whom it sounds like I've contradicted myself:
please remember that I may strongly disagree with your action
yet believe that you should have the legal right and the
physical ability to do it.


