[extropy-chat] Science and Fools

Eliezer S. Yudkowsky sentience at pobox.com
Mon Mar 21 23:24:31 UTC 2005


Robin Hanson wrote:
> At 01:28 AM 3/21/2005, Hal Finney wrote:
> 
>> That would extend my rule about not disagreeing with the scientific 
>> consensus, to not disagreeing with the consensus in other fields. 
>> Isn't there a danger that this broader view is more likely to run into 
>> the situation where different fields have very different opinions 
>> about some common subject matter?  Religion vs biology on evolution, 
>> liberalism vs conservatism on politics?
>> Are you saying that there should be no difference in how you weight
>> the information about consensus in a field of study, based on how
>> much progress the field has made, and how accurate it has been in
>> the past?  Or how would you incorporate that kind of information?
> 
> When considering whether to disagree with someone, you must try to infer 
> their information, analysis, and rationality relative to you.   There 
> are many things that might give you clues about these things, but most 
> of these clues are rather weak.  The recent rate of progress in a field 
> is a rather weak clue about the analytical ability and rationality of 
> the people in the field.  When the field has a consensus, and you are 
> considering disagreeing with it, the evidence is usually strong that 
> they have a lot more information than you.  So in this situation it is 
> hard to see how the rate of progress would make that much difference.

But here's an interesting question:  *Why* do some fields accumulate new 
agreed-upon information (and perhaps technological applications) much 
more rapidly than other fields?  Supposing, and it is a large 
supposition, that these fields are accumulating better approximations to 
the truth, what is it that some fields are doing right, and others doing 
wrong?  For if I can fathom this strange difference, this magic quality, 
this mark of Reason, then I can wield that power directly to determine 
which ideas I ought to accumulate in myself.  And I will thereby be 
that much closer to the answer.  Of course, to wield this power may 
require that I acquire some technical knowledge of the direct matter of 
interest, and that knowledge may not be cheap.  But what the hell.  At 
least there's the theoretical possibility of making progress that way. 
I mean, what makes you think you can arrive at the correct answer, no 
matter how much you scrutinize the other guy's psychology, if you don't 
have the technical knowledge?  What makes you think that a secondhand 
nontechnical answer is even useful?

Maybe to distinguish truth from falsehood you need to do technical 
thinking about the direct matter of interest, and if you can't perform 
that procedure you're screwed, just screwed, there *is* no way to tell 
from the outside.  I do not say that this is always so.  I do not even 
say that it is usually so.  But it is a possibility that I try to bear 
in mind.  And this thought encourages me to bite deeper into the meat of 
problems.

Once upon a time, physicists told me that light was made of waves; and 
being one who draws upon the scientific consensus, I had faith that this 
was so, although I did not know how to prove it for myself.  And then, 
years later, I was reading the Feynman Lectures, and I came across a gem 
called "the wave equation".  I thought about that equation for three 
days, until I saw to my satisfaction that it was stupidly obvious.  And 
when I understood, I realized that my faith that light and sound and 
matter were "waves" had been utterly useless.  The physicists had not 
lied to me.  But I then had no idea what the word "wave" meant to a 
physicist.  How much good does it *really* do to borrow knowledge whose 
truth you can't judge for yourself?  Even if your mastery of psychology 
enabled you to know with certainty that a physicist is rational when 
saying "Sound is waves", what have you actually learned?  You have 
learned to associate one syllable with another.
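
For anyone who has not met it: in its simplest one-dimensional form 
(the standard textbook statement, not a quotation from the Lectures), 
the wave equation reads

    \[
      \frac{\partial^2 \psi}{\partial t^2}
        = c^2 \, \frac{\partial^2 \psi}{\partial x^2}
    \]

The acceleration of the displacement psi at each point is proportional 
to the local curvature there, with c the speed of propagation; any 
function of the form psi(x, t) = f(x - ct) + g(x + ct) satisfies it. 
That, and not a vague picture of ripples, is what the physicist's word 
"wave" points at.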

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


