[extropy-chat] Moral Truths (was Collective Singularities)

Anders Sandberg asa at nada.kth.se
Mon Jun 5 10:05:27 UTC 2006


Lee Corbin wrote:
> I assume that I'm correctly assuming that we have no inkling of any
> level of computability beyond the most general one we are familiar with,
> and quite a bit of circumstantial evidence to suppose that no such level
> exists.

Actually, there is a whole menagerie of hypercomputing models. We just
don't know how to do it, or whether we can:
http://www.amirrorclear.net/academic/papers/many-forms.pdf
http://arxiv.org/ftp/math/papers/0209/0209332.pdf

>> From an ethical standpoint this is relevant, since if there is nothing
>> above us, then if morality is something that can be discovered or
>> deduced, all posthumans will be equivalent in potential understanding
>> of morality.
>
> Do you in fact believe that what is moral can be deduced or discovered?
> If so, why?  I must say that to me, the entire notion of evolutionary-
> independent morality is extremely dubious.

Well, on alternate weeks I am positive about this; in the others I doubt
it. Obviously some aspects of morality are evolution-independent, like
the demand that it be self-consistent (an inconsistent morality leads to
dilemma situations; an otherwise similar morality without dilemmas would
at least be more efficient to apply). However, I think most of these
aspects are fairly trivial (others disagree).
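
To make the consistency point concrete, here is a toy sketch (entirely
made up, in Python): two invented rules are applied to a handful of
invented situations, and a "dilemma" is any situation where they
prescribe incompatible actions.

# Illustrative only: a made-up rule set checked for mutual consistency.
# A "dilemma" is any situation where two rules demand incompatible acts.

situations = [
    {"promise_made": True,  "keeping_it_harms_someone": False},
    {"promise_made": True,  "keeping_it_harms_someone": True},
    {"promise_made": False, "keeping_it_harms_someone": False},
]

def rule_keep_promises(s):
    # Prescribes "keep" whenever a promise was made, regardless of harm.
    return "keep" if s["promise_made"] else None

def rule_do_no_harm(s):
    # Prescribes "break" whenever keeping the promise would harm someone.
    return "break" if s["keeping_it_harms_someone"] else None

rules = [rule_keep_promises, rule_do_no_harm]

for s in situations:
    verdicts = {rule(s) for rule in rules} - {None}
    if len(verdicts) > 1:
        print("Dilemma:", s, "->", verdicts)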

If morality is derived from evolution then we need practical intelligence
to discover and apply it, since clearly just relying on evolved moral
intuitions will be problematic when we get outside of our environment of
adaptation.

>> But maybe there are moral truths or decisions that can only be reached
>> using quantum computing?
>
> Well, until you persuade me that moral truths exist at all, this will
> continue to sound pretty silly  :-)

Suppose we speak about "practical moral truths" in the sense of "if you do
this, things will usually be good". One can see them as policy functions
in reinforcement learning that maximize some utility function for you
(i.e. we leave out the issue of what that utility ought to be, which is
what most moral philosophers would likely consider real morality). Some
"truths" of this kind are heuristics for how to handle prisoner's dilemma
situations, whether or not to cheat, or how one ought to deal with an
unjust government. Clearly, finding answers to such questions (even when
merely assuming a particular utility function) can be arbitrarily complex.
Hence there are likely practical moral truths that we cannot deduce using
mere Turing computation but would actually need something more powerful
to get at.
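
To make the policy-function reading concrete, here is a minimal sketch
(my own toy example, with the standard prisoner's dilemma payoffs
standing in for the utility function): a few heuristic policies play the
iterated prisoner's dilemma against each other, and the "practical moral
truth" is just whichever heuristic tends to score well.

import itertools

# Toy illustration with an assumed utility: total payoff under the
# standard prisoner's dilemma matrix, accumulated over repeated play.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def always_cooperate(my_hist, their_hist):
    return "C"

def always_defect(my_hist, their_hist):
    return "D"

def tit_for_tat(my_hist, their_hist):
    # Cooperate first, then copy the opponent's previous move.
    return their_hist[-1] if their_hist else "C"

POLICIES = {"always_cooperate": always_cooperate,
            "always_defect": always_defect,
            "tit_for_tat": tit_for_tat}

def play(policy_a, policy_b, rounds=100):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = policy_a(hist_a, hist_b)
        move_b = policy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Round-robin tournament: accumulated payoff stands in for "usually good".
totals = {name: 0 for name in POLICIES}
for (name_a, pol_a), (name_b, pol_b) in itertools.combinations(
        POLICIES.items(), 2):
    score_a, score_b = play(pol_a, pol_b)
    totals[name_a] += score_a
    totals[name_b] += score_b

for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(name, total)

Even in this tiny setting, which heuristic does best depends on the mix
of policies it meets, which gives some flavour of why the general problem
blows up.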

(The idea of an "inner voice" in Christianity and in Socrates is amusingly
like oracle Turing machines, where a black box supplies hyperturing
help.)
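
A cartoon of that, purely for flavour (the oracle here is faked with a
lookup table, since a genuine one is exactly what we cannot build):
ordinary Turing-computable deliberation that occasionally consults a
black box.

# Cartoon of an oracle machine: ordinary computation plus a black box.
# The box is faked with a hard-coded table; a genuine oracle (say, for
# the halting problem) is precisely what no Turing machine can provide.

ORACLE_ANSWERS = {"should_i_cheat_here": False,
                  "is_this_government_unjust": True}

def oracle(query):
    # Black box: from the machine's point of view the answer just appears.
    return ORACLE_ANSWERS[query]

def deliberate(query):
    # Ordinary, computable reasoning wrapped around an oracle call.
    return "act" if oracle(query) else "refrain"

print(deliberate("should_i_cheat_here"))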


-- 
Anders Sandberg,
Oxford Uehiro Centre for Practical Ethics
Philosophy Faculty of Oxford University




