[extropy-chat] Re: Overconfidence and meta-rationality

Robin Hanson rhanson at gmu.edu
Mon Mar 21 16:35:38 UTC 2005


On 3/13/2005, Eliezer S. Yudkowsky wrote:
>>>It happens every time a scientific illiterate argues with a scientific 
>>>literate about natural selection.  ...  How does the scientific literate 
>>>guess that he is in the right, when he ... is also aware of studies of 
>>>human ... biases toward self-overestimation of relative competence? ... 
>>>I try to estimate my rationality in detail, instead of using unchanged 
>>>my mean estimate for the rationality of an average human.  And maybe an 
>>>average person who tries to do that will fail pathetically.  Doesn't 
>>>mean *I'll* fail, cuz, let's face it, I'm a better-than-average rationalist.
>>You claim to look in detail, but in this conversation on this the key 
>>point you continue to be content to just cite the existence of a few 
>>extreme examples, though you write volumes on various digressions.  This 
>>is what I meant when I said that you don't seem very interested in formal 
>>analysis.
>
>I don't regard this as the key point.  If you regard it as the key point, 
>then this is my reply: while there are risks in not foreshortening the 
>chain of logic, I think that foreshortening the reasoning places an upper 
>bound on predictive power and that there exist alternate strategies which 
>exceed the upper bound, even after the human biases are taken into account.

To repeat: I am *not* suggesting that you foreshorten or ignore
anything!  I am suggesting that you might pay more attention to a certain 
big clue.

>To sum up my reply, I think I can generate an estimate of my rationality 
>that is predictively better than the estimate I would get by substituting 
>unchanged my judgment of the average human rationality on the present 
>planet Earth, even taking into account the known biases that have been 
>discovered to affect self-estimates of rationality.  And this explains my 
>persistent disagreement with that majority of the population which 
>believes in God - how do you justify this disagreement for yourself?

It is far from enough for you to be better than the average human.  Anytime 
you disagree with someone, you have to ask yourself how you compare to 
*that* person.  And since even the very best people by most any metric 
disagree a whole awful lot, clearly even the very best are making some 
serious mistakes in estimating their relative rationality.  So you have to 
ask yourself why you think you are doing better than *them*.  And ask in 
some detail.

>>Maybe there are some extreme situations where it is "obvious" that one 
>>side is right and the other is a fool.
>
>How do these extreme situations fit into what you seem to feel is a 
>mathematical result requiring agreement?  The more so, as, measuring over 
>Earth's present population, most cases of "obviousness" will be 
>wrong.  Most people think God obviously exists.

Imagine you came upon a brick wall with the following words painted on 
it:  "I am not a brick wall!  I am a rational conscious being who hears 
everything you say here, and could at anytime choose to change these words 
here.  I know all about the theory of disagreement.  And I think you, the 
guy there reading me, are an idiot." You hit the wall with a hammer and 
paint and brick chips fly - looks just like paint on a brick wall to you.

*Of course* you should feel free to disagree with this brick wall.  The 
clues are overwhelming here that this wall is an idiot.  Some person may 
have once written those words, and had reasons for them, but the wall is 
clearly *not* listening to your arguments.  Humans are biased to reject the 
views of those who call them idiots, but to make you misread these clues in
a case like this the bias would have to be fantastically strong, much
stronger than we need to explain ordinary human arrogance.

But our ability to point to extreme cases like this does *not* license the 
ordinary range of human disagreement.

>If you're asking after specifics, then I'd have to start describing the 
>art of specific cases, and that would be a long answer.

Yes, I was asking after specifics.  If no time now for such an answer, that 
is fine.

>>If so, what indicators are you using there, and what evidence is there to 
>>support them?
>
>When I disagree with an 'educated' person, it may be because I feel the 
>other person to be ignorant of specific known results; overreaching his 
>domain competence into an external domain; affected by wishful thinking; 
>affected by political ideology; educated but not very bright; a 
>well-meaning but incompetent rationalist; or any number of reasons.  Why 
>are the specific cues important to this argument?  You seem to be arguing 
>that there are mathematical results which a priori rule out the usefulness 
>of this digression.

I think that impression is wrong.  But it is also not enough to have a long 
list of possible mistakes you think the other person might have made.  Most 
everyone explains their disagreements in terms of mistakes they think others
have made.  But people, even very smart people, are clearly biased toward
underestimating their own chances of making such mistakes.  So the big key
question is how you can avoid such biases in your own estimates.

>>A formal Bayesian analysis of such an indicator would be to construct a 
>>likelihood and a prior, find some data, and then do the math.  It is not 
>>enough to just throw out the possibility of various indicators being useful.
>
>I lack the cognitive resources for a formal Bayesian analysis, but my best 
>guess is that I can do better with informal analysis than with no 
>analysis.  As the Way renounces consistency for its own sake, so do I 
>renounce formality, save in the service of arriving at the correct answer.

How about an informal, but explicit, Bayesian analysis?
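
For what such an explicit-but-informal analysis might look like, here is a
minimal sketch of a Bayesian update on one disagreement "indicator."  Every
number in it is invented for illustration — it is the *shape* of the
calculation, not any actual estimate from this exchange:

```python
# Explicit Bayesian update on a disagreement "indicator":
# how much should observing a cue (e.g. "the other person seems
# ignorant of a specific known result") shift my belief that I am
# the one who is right?  All probabilities here are hypothetical.

def posterior_right(prior, p_cue_if_right, p_cue_if_wrong):
    """P(I am right | cue observed), by Bayes' rule."""
    numer = prior * p_cue_if_right
    denom = numer + (1 - prior) * p_cue_if_wrong
    return numer / denom

# Symmetric prior: absent any cue, the bare fact of a disagreement
# says nothing about which side is right.
prior = 0.5

# Hypothetical likelihoods: I judge the cue somewhat more probable
# when I am right than when I am wrong -- but since self-serving bias
# makes me "observe" such cues too readily, the ratio should be modest.
p = posterior_right(prior, p_cue_if_right=0.6, p_cue_if_wrong=0.4)
print(round(p, 2))  # a modest update, not near-certainty
```

The point of writing it out is that the likelihoods must be defended: the
question above — how you avoid bias in your own estimates — becomes the
question of why `p_cue_if_right / p_cue_if_wrong` deserves to be much
greater than one for you in particular.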


Robin Hanson  rhanson at gmu.edu  http://hanson.gmu.edu
Assistant Professor of Economics, George Mason University
MSN 1D3, Carow Hall, Fairfax VA 22030-4444
703-993-2326  FAX: 703-993-2323  




