[extropy-chat] Are ancestor simulations immoral? (An attempted survey)

Samantha Atkins sjatkins at mac.com
Wed May 24 01:46:45 UTC 2006


On May 23, 2006, at 5:17 PM, Robert Bradbury wrote:

>
> Oh, it gets *far*, *far* worse than the proposals thus far
> made...  ;-)
>
> First, since it's a simulation, it's perfectly reasonable to do
> whatever one feels like.  One may be running simulations whose
> explicit purpose is to explore morality.  In which case one needs
> to create situations where some people may experience pain, suffer,
> die, etc.  Either we are in the bottom-level "reality" or everyone
> in the upper levels of reality has thrown away the "morality"
> paradigm.  (This is the "If there is an all-powerful God, why do
> people have to suffer and die?" question dressed differently.)
>

Well, if you are going to bother to create a world and have beings in
it (or that evolve in it) who have some ability to choose among
alternatives, and where choices have consequences, then there will be
suffering.  Any world you create will have some governing laws of
physics or the equivalent, even if you choose to change or violate
them periodically.  If it has laws of physics, i.e., identity, and it
has choosing agents, then it will always be possible for the agents
to make choices from which they suffer consequences.  It is not even
necessarily a matter of "doing whatever one likes".

Also, I have a suspicion that any sufficiently powerful intelligence
creates de facto simulations simply in the act of mulling over
scenarios, given the level of detail and scope involved.  If so, then
simulations arise naturally from the very thoughts and daydreams of
sufficiently advanced minds.

I also suspect that one reason for creating a historical sim is to
tweak the factors involved as minimally as possible to get a
different and better outcome.  This could be one way to learn more
deeply from experience.
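To make that concrete, here is a toy sketch of the idea.  Every name,
parameter, and the "world" dynamics below are invented purely for
illustration; a real historical sim would be unimaginably richer.

# Toy sketch of a minimal-tweak search over a simulated history.
# Everything here -- the parameters, the "world", the scoring -- is
# made up for illustration only.

import itertools

def simulate(params):
    # Stand-in "world": a trivial deterministic function of its inputs.
    # The outcome improves with cooperation, degrades with scarcity
    # and conflict.
    return (2.0 * params["cooperation"]
            - params["scarcity"]
            - 1.5 * params["conflict"])

def minimal_tweak(baseline, step=0.1, max_steps=5):
    # Try progressively larger single-factor perturbations and return
    # the first (i.e., smallest) one that beats the baseline outcome.
    base_score = simulate(baseline)
    for n in range(1, max_steps + 1):
        for key, sign in itertools.product(baseline, (+1, -1)):
            tweaked = dict(baseline)
            tweaked[key] += sign * step * n
            if simulate(tweaked) > base_score:
                return key, sign * step * n, simulate(tweaked)
    return None  # no single small tweak improves the outcome

history = {"cooperation": 0.4, "scarcity": 0.7, "conflict": 0.5}
print("baseline outcome:", simulate(history))
print("smallest improving tweak:", minimal_tweak(history))

The point of the toy is just the search order: smallest perturbations
are tried first, so the first improvement found is also the most
minimal one.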

> Furthermore, once we have the ability to run simulations ourselves
> (remember *worlds* of billions of RBs, LCs, JHs, etc. -- i.e., 10^10
> copies' worth of each isn't even stretching things), there doesn't
> seem to be much one can do to prevent it.

I doubt this world or our current small selves are sufficiently  
interesting for such astronomical numbers of simulated copies to be  
at all likely.

>
> If I happen to spot an unused solar system nearby and say I am
> going to colonize it and turn it into a massive simulation engine
> to run billions and billions of RBs (some of whom are bound to
> suffer horribly and die), then some holier-than-I people are going
> to be bound and determined to prevent me from doing that.  The only
> way that seems possible is either (a) to imprison me on the Earth
> (or even in a very high security prison) or (b) to take the nanobots
> to my mind and force the removal of such "immoral" thoughts (i.e.,
> state or equivalent "mind control").
>

Who says it is immoral?  Silly maybe, but immoral?

> The only way I can see to prevent this is heavily enforced state
> restriction on access to the various technologies and capabilities
> we regularly discuss on the list.  (I.e., probably no lifespan
> extension, no access to "personal" nanorobots (to build my rocket
> ship), no ability to "program" bio/nano-matter (to manufacture the
> materials for my rocket ship), standing shoot-to-vaporize orders on
> anything unauthorized leaving the solar system (I'm not even sure
> if you could prevent low-level nanorobot "leakage" from the solar
> system).  Etc.)

I would rather risk suffering and dying, even countless times, in
some lunatic posthuman hell than allow some sophonts such unlimited
power to forbid absolutely anything and everything that might be used
somehow, sometime, in a way they don't like.  That would be worse
than hell for me.

- samantha 


