[extropy-chat] Re: SI morality

Chris Phoenix cphoenix at CRNano.org
Mon Apr 19 05:11:33 UTC 2004

Robert J. Bradbury wrote:
 > An argument may be completely rational -- but may be based on false
 > premises.  For example a premise that Jews were granted *all*
 > of Israel/Palestine (by God).

Perhaps a comparison with formal reasoning will help.  Geometry is 
based on axioms; I think these are the same as your "premises."  These 
axioms are not fixed; they depend on context.  On the surface of a 
sphere, there are no parallel lines: any two straight lines (great 
circles) meet.
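The spherical case can be made concrete with a short sketch (the 
numeric setup below is my own illustration, not from the original 
discussion).  A "straight line" on a sphere is a great circle, the 
intersection of the sphere with a plane through its centre; two 
distinct great circles always meet, at a pair of antipodal points 
given by the cross product of the two planes' normal vectors.

```python
import math

def cross(a, b):
    # Cross product of two 3-vectors.
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    # Scale a vector to unit length.
    n = math.sqrt(sum(x*x for x in v))
    return tuple(x / n for x in v)

def great_circle_intersections(n1, n2):
    """Points where the great circles with plane normals n1, n2 meet.

    Any two distinct great circles intersect at two antipodal points,
    lying along the cross product of the two plane normals.
    """
    p = normalize(cross(n1, n2))
    antipode = tuple(-x for x in p)
    return p, antipode

# Two meridians: their planes' normals both lie in the equatorial plane.
p, q = great_circle_intersections((1, 0, 0), (0, 1, 0))
print(p, q)  # they meet at the north and south poles
```

Change the context (the axioms) and a theorem like "parallel lines 
never meet" simply stops applying; nothing irrational has happened.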

So it's possible to be completely rational in a certain context, and get 
results that don't make any sense outside that context.  If someone 
believes in God and divine miracles, they can quite rationally conclude 
that God could have flooded the world suddenly, 4000 years ago.  In the 
context of science, this makes as much sense as parallel lines meeting.

 >  That is the problem we need to deal
 > with here -- *when* are rational arguments based on invalid premises.
 > Secondarily when one is dealing with invalid premises how does one assert
 > that?

Here, I'll disagree.  The problem is not that premises are valid or 
invalid.  The problem is when results are evaluated in or out of 
context.  Unfortunately, a major human cognitive flaw is the belief 
that the context inside one's head extends far beyond it.  Of course, 
the internal context is necessarily far simpler than the real world. 
I'll get to the real world later.

Note that useful thoughts can be arrived at by either rational or 
non-rational thought.  Rationality is only as good as its premises, and 
for many problems, we simply don't know enough to pick premises.  I'll 
have more to say about this later, too.

 > It seems like it is reasonable for anyone to assert that my premises
 > are as valid as yours unless one can claim higher ground with respect
 > to the superiority of the validation of ones premises.  If one is
 > dealing with people uneducated in these methods of evaluation then
 > such assertions are useless.

It's easy to say that the context inside someone's head is wrong or 
meaningless.  But that's dangerous and unsustainable, probably morally 
wrong by most systems of morality, and it blinds you to the rich value 
of human thought processes.

Everyone's mental context is limited.  Within their heads, their 
premises are fine.  The trouble comes when they try to compare their 
results to a different context.  The results will probably disagree, and 
this will usually reinforce inaccuracy.  It's extremely hard for people 
to change their minds in response to a conflict between what's in their 
minds and what's outside their minds.  They take the easiest path to 
make the conflict go away, which frequently involves warping their 
perceptions or rationalizing a reason to ignore the input.

An even bigger problem comes when the process of denying input leads to 
xenophobia or neophobia.  People may insist they're right to the point 
of trying to coerce the whole world into agreement with what's in their 
heads.  Religious fanatics and scientists alike may go on crusades to 
try to crush ideas they don't like.

 > If I claim that gravity is a physical force that draws things
 > towards greater masses I can drop a tennis ball and demonstrate
 > it for anyone to see.

The idea that, because you can get the effects you expect, everyone 
else will see what you see--and will be forced to accept your 
explanation--is not consistent with human psychology.
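The tennis-ball demonstration itself is easy to quantify, which is 
exactly why it feels so compelling.  A minimal sketch, assuming a 
roughly shoulder-height drop (the 1.5 m figure is my assumption):

```python
import math

g = 9.81  # m/s^2, gravitational acceleration near Earth's surface
h = 1.5   # m, assumed drop height (roughly shoulder height)

# Constant-acceleration fall time: h = (1/2) g t^2, so t = sqrt(2h/g).
t = math.sqrt(2 * h / g)
print(round(t, 2))  # -> 0.55 (seconds)
```

But the calculation only compels someone who already accepts the 
context it lives in; the demonstration and the explanation are 
separate things.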

Even scientists reject observations all the time.  They read the data 
differently.  They assert that there must be errors in the method.  They 
attack the credentials of the observer.  They change the subject.  They 
build strawman attacks, and frequently even appear to convince 
themselves.  They form cliques.  They tell their students not to even 
read the claims, and certainly not to investigate them.

 >  If I claim that Christ could manage to
 > turn 5 loaves of bread into enough to feed 5000 people (John 6:1-15)
 > without invoking nanotechnology I am on somewhat swampy ground.

Lots of people brought lunches; lots of people didn't.  So you bring a 
kid up onstage and get him to share his lunch.  Everyone starts sharing, 
and it turns out that a lot of people packed more food than they needed. 
A neat example of leadership and community-building.  The "that's
mystical" / "that's impossible" conflict can make it surprisingly hard 
to find mundane and obvious explanations for things.  This is an example 
of premature rejection of data that causes discomfort.

 > But most people are willing to accept the premises based on
 > belief or faith rather than on evidence that the premises are
 > valid.

Most people's minds are so self-referential that they can manufacture 
evidence for or against anything.  Especially if someone is telling them 
how to interpret it.  It's a mechanism of human psychology: if you can 
actually get someone to a profoundly confused state, and then you tell 
them something that makes it all make sense, they'll cling to that 
explanation like a life preserver.

For example, get someone to feel unhappy and guilty and confused about 
their life.  Then tell them, "Your unhappiness is caused by lack of 
belief in God."  This makes them feel less confused--and they feel 
better--so it must be right!--so the rest of their confusion goes away. 
And that's the evidence.  They can feel the healing touch of God,
soothing all their worries.  In their internal context, this is *real 
evidence*.  From what we know of psychology, we can see how the illusion 
works.  But we can't say they have no evidence--all we can say is that 
they misinterpreted their perceptions.

For another example, show someone a counterintuitive scientific effect, 
then give them the scientific theory behind it.  They'll believe it.

Or do a magic trick, and give them any plausible-sounding explanation.

Note that until they try to apply or test what they're told, the last 
two examples are functionally indistinguishable.  So don't be so quick 
to condemn people for lack of evidence.  First, most people have real, 
tangible evidence underlying their faith.  Second, most scientific 
explanations accepted by laymen are not based on evidence but on 
interpretation--and not even their own interpretation, but the one they 
were handed by the scientist.

 > With regard to Paul's question.  You have to view the fact that
 > rational argument, behavior, etc. has a greater value than the
 > inverse.

Which inverse?  There are several kinds of non-rational and irrational 
thought.  Intuition; trust in assertions made by others; rationalized 
primitive/emotional inclinations; and so on.  Some of these, in some 
contexts, are reasonable strategies.  After all, what's the 
highest-value thing to do when it's just been demonstrated that you're 
in an unknown context and all your postulates are suspect?  Is it to 
grab the first postulate that comes along and seems to fit?  No: as I 
explained above, that leaves you wide open to religious conversion. 
(Unfortunately, it's often hard for human brains to try out postulates 
while retaining the ability to discard them.  Once it's in, it redefines 
self and non-self, and anything that threatens it is rejected.)  It may 
be better to fall back on non-rational thought until you've had a chance 
to comprehend your new environment and learn postulates that lead to 
sane thinking.

Now I'm ready to talk about the "real world" and about science.  It's 
tempting to think that the world is a single context that everything can 
be compared to.  But this is equivalent to reductionism.  There are lots 
of things in the world that can be understood far more completely by 
approximation than by first principles.  For example, human psychology 
has some really weird phenomena (phobias, optical illusions, 
passive-aggressive behavior, etc) that a study of physics will not help 
you understand.  To a psychoanalyst or a politician, or even a medical 
doctor, a study of shamanism will have more concrete utility than a 
study of electromagnetism.

In fact, when dealing with people, not studying at all--not trying to 
form postulates and practice formal thought, but just going on instinct, 
intuition, and experience--may be more effective.  This is because 
people are incredibly complex, and we have a strong evolved non-rational 
toolset to help us deal with them.  In addition to people, things like 
ecology may still be too complex for rational thought to improve on 
accumulated heuristics, because we simply don't yet know the postulates 
and methods.  And then there are things like immunology and cosmology 
where none of our tools really work yet, so the only way to approach 
them is by study and rationality.  Eventually, we can expect that study 
and rationality will encompass psychology (including religion and 
parapsychology) and ecology and everything else as well.

You mentioned the undesirability of chaos.  The alternative to chaos is 
the belief that a self-consistent real-world context exists.  But even 
though it exists, we can't access it directly.  Science is motivated by 
the desire to build conceptual contexts that map to the real-world one. 
Its methods include cataloging (an underrated skill these days),
categorization, experiment, creativity, criticism, and more.  In some 
sub-contexts like electromagnetism, scientists have been very 
successful; the mapping is very close.  In protein folding, the end is 
in sight.  Pedagogy, psychology, and oncology are quagmires, though 
oncology may be ready for a synthesis.

But back to the practice of science: the trouble is that scientists, 
like everyone else, are prone to the illusion that their chosen context 
extends everywhere.  Let's be clear: I don't mean that scientists should 
leave room for the paranormal or magical.  They should not.  I mean that 
chemists should leave room for physics, and physicists should leave room 
for psychology, and psychologists should leave room for chemistry. 
Otherwise you get absurdities like chemists declaring that Drexler's 
physics and mechanics work is worthless, when it's obvious they don't 
even understand it.

One thing I never see addressed in discussions of rationality: How does 
a rational thinker know when to keep their ears open and their mouth 
shut?  Obviously, the belief that a rational thinker will be an expert 
in everything is irrational.  But it's far too common.  Scientists are 
slowly learning enough to be rational in certain limited contexts.  And 
in a few glorious areas, those contexts have spread enough to merge. 
But anyone who aspires to rationality should learn from the 
overconfidence of scientists who, secure in their rationality, talk 
nonsense outside their field.  That's as big a mistake--I would argue 
that it's the same mistake--as religious people talking nonsense while 
feeling secure in their irrationality.  The mistake is assuming that 
their mental context extends farther than it actually does.

And scientists and rationalists have even less excuse than 
irrationalists.  If as great a scientist as Lord Kelvin could be wrong 
about something as mundane and technical as heavier-than-air flight, 
surely the rest of us should be extremely cautious when talking outside 
our field of study--or even inside it, for many fields.  But no, we keep 
making the same mistake: our context defines our universe, and 
everything we see must be made to conform.  Appeals to rational thought, 
in the end, are usually just another way to rationalize this process.


Chris Phoenix                                  cphoenix at CRNano.org
Director of Research
Center for Responsible Nanotechnology          http://CRNano.org
