[extropy-chat] Are ancestor simulations immoral? (An Attempted Survey)
Lee Corbin
lcorbin at tsoft.com
Fri May 26 00:12:20 UTC 2006
Jeffrey replies to Samantha, but asks questions too delicious for
me to pass up!
> Samantha:
> > "This is a straw man that was not advocated."
> I didn't mean to claim that it was advocated, I was making my case.
Well, Jeffrey, I have to add that I had the same impression that
Samantha did. In my words, you did seem to be arguing against a
position that I don't think anyone on this list takes.
> Let me ask you a question:
> I assume that we agree that a "real" being and a conscious
> "simulated" being are both composed of hardware and software
> and that both exist at the "real" layer of "reality". Why
> should ending the life of a "simulated" being, be viewed
> any differently than a "real" being murdering another "real"
> being?
Here is a case of the same thing, I think. I don't believe that
anyone at all on this list would say that there is a difference.
> Why should torturing a conscious "simulated" being, be viewed
> differently than a "real" being torturing another "real" being?
> The crimes are equivalent.
Likewise---if you mean "moral crime".
> The only "factor" that would supposedly separate the status
> of a "real" being from the status of a "simulated" being is
> that the "real" one was born first and therefore supposedly
> deserves to wield ultimate power over the one that was born
> later.
It all depends. I do believe that you are ignoring differences
that are essential to me, and perhaps to the others. I suspect
that the core of the actual difference between our viewpoints
can be seen in your next statement:
> That's "messed up"; it's legally allowing murder and torture.
Bringing "The Law" into it is an *entirely* different can of
worms. Surely you don't believe that everything that is bad
should be outlawed. Or do you?
Almost *all* of our progress and all of the humanitarian
improvements in the human condition over the last ten thousand
years have stemmed from (1) Rule of Law, and (2) Protection
of Private Property. It is extremely hazardous, in my
opinion, not to treat these two principles with the utmost
respect.
As the 20th century showed, there is practically no limit
to the harm that results from tampering with these principles,
tampering that is always accompanied, of course, by the best
of intentions.
As Samantha said, it may be non-trivial to determine the
extent of consciousness of a character in a video game.
I suspect that the difficulty can be arbitrarily great;
i.e., that it could sit arbitrarily far up the hierarchy
of complexity classes, beyond NP-complete.
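To give that difficulty a concrete flavor, here is a toy sketch
of my own devising (the character, its Collatz inner loop, and the
names npc_step and ever_quiescent are all invented for illustration,
not anything from this thread). Even for this trivially small "NPC",
nobody knows a general procedure for deciding whether every starting
state eventually settles; and for arbitrary programs, Rice's theorem
says that no nontrivial semantic property is decidable at all.

    # A toy "NPC" whose hidden inner loop follows the Collatz map.
    # Deciding whether EVERY starting state settles to 1 is the open
    # Collatz problem; for arbitrary programs, such reachability
    # questions are undecidable, not merely NP-complete.

    def npc_step(state: int) -> int:
        """Advance the character's hidden state by one tick."""
        return state // 2 if state % 2 == 0 else 3 * state + 1

    def ever_quiescent(start: int, max_ticks: int = 10_000) -> bool:
        """Empirical check only; no general decision procedure is known."""
        state = start
        for _ in range(max_ticks):
            if state == 1:
                return True
            state = npc_step(state)
        return False  # inconclusive within the tick budget

    print(ever_quiescent(27))  # True, though it takes 111 ticks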
So we come to the classic question: Who is to decide?
The totalitarian answer is that all power should be in the
hands of the people, i.e., in the hands of their elected
or non-elected representatives. In other words, the government
must decide which of your actions are ethical and moral, and
which are not.
But the evolved solution, namely (1) Rule of Law, and (2)
Protection of Private Property, is far less ambitious. It
recoils from the idea that wisdom can be concentrated in a
single place (e.g. the Supreme Court), or indeed anywhere
that is not *local*.
Thomas Sowell explains all this with fantastic clarity in
his books, such as "Knowledge and Decisions". The greatest
thinkers of the past, e.g. von Mises and Hayek, were perhaps
the first to understand this deeply, but an "in practice"
understanding was also achieved by America's founders, who
legislated tremendous *restrictions* on what higher bodies
could do to lower ones.
Therefore the instinctive recoil of people like Samantha, who
are extraordinarily leery of having some body intercede in their
affairs to determine the complexity of their software and whether
or not they're doing the "right" thing with it, is well-founded.
I do not happen to have as strong an aversion as many like
her do; but still, I can fully understand on *principle* that
it is best for someone to be able to determine what happens
inside their own minds and inside their own property, without
an outside regulator poking around.
It seems that if I were to run a software program that created
and destroyed a sentient every microsecond, then after about
a minute you would consider me the greatest mass-murderer of
all time. Is that true?
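To pin down the scale, the arithmetic is simply one per microsecond
times sixty seconds (my back-of-the-envelope figure, not anything
stated in the thread):

    # One sentient per microsecond, sustained for sixty seconds:
    sentients = 60 * 1_000_000
    print(sentients)  # 60000000, i.e. about sixty million

That is roughly sixty million sentients in a single minute.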
Maybe you should think of what's bugging you this way: what
are the odds that if you grant someone freedom they'll
immediately conjure up a hell and millions of sophonts
to agonize in it?
Lee