[extropy-chat] Are ancestor simulations immoral? (An attempted survey)
Lee Corbin
lcorbin at tsoft.com
Wed May 24 04:08:01 UTC 2006
Samantha writes
> Also, I have a suspicion that any sufficiently powerful intelligence
> creates de facto simulations just in the act of mulling over
> scenarios due to the greater level of detail and scope involved. If
> so then simulations naturally arise from the very thoughts and
> daydreams of sufficiently advanced minds.
A good point that should be kept in mind.
> I doubt this world or our current small selves are sufficiently
> interesting for such astronomical numbers of simulated copies to be
> at all likely.
I wish that those who believe that there is a good chance that *we*
are living in a simulation would remember this!
> [Robert wrote]
> > to run billions and billions of RBs (some of whom are bound to
> > suffer horribly and die) then some holier than I people are going
> > to be bound and determined to prevent me from doing that. The only
> > way that seems possible is to either (a) imprison me on the Earth
> > (or even in a very high security prison) or (b) take the nanobots
> to my mind and force the removal of such "immoral" thoughts (i.e.
> state or equivalent "mind control").
>
> Who says it is immoral? Silly maybe, but immoral?
Oh, I can easily see why folks might say it's immoral. Consider
even the case where mere rumination of an extremely advanced
being---as you wrote before---amounts to horrible torture of
a human-equivalent sophont. It's *real* easy to say that such
thoughts on the part of an advanced entity are immoral.
In fact, I concur, unless it can be shown that this vast
creature gets such overwhelming benefit from the activity
that it becomes worth it, globally.
> I would rather risk suffering and dying, even countless times, in
> some lunatic posthuman hell than allow some sophonts such unlimited
> power to forbid absolutely anything and everything that might be used
> somehow, sometime in a way they don't like. That would be worse
> than hell for me.
Are you sure that you are not exaggerating? Provided that I fell
into the clutches of a rather benevolent AI, life could still be
full of almost everything I value. True, I couldn't engage
in "too accurate" historical simulations and a few other banned
activities, but I'd greatly prefer that to your "suffering and
dying countless times in a posthuman hell".
Lee