[extropy-chat] Experts suck
Hal Finney
hal at finney.org
Thu Dec 22 04:08:14 UTC 2005
I ran across a very nice review of what sounds like an excellent book
at <http://www.newyorker.com/critics/books/articles/051205crbo_books1>.
The book is Philip Tetlock's "Expert Political Judgment: How Good Is
It? How Can We Know?" available from
<http://www.amazon.com/gp/product/0691123020/qid=1135215537/sr=2-1/ref=pd_bbs_b_2_1/103-1198668-8502246?s=books&v=glance&n=283155>.
Based on the review, this book powerfully debunks the ability of experts
to predict trends any better than ordinary people, or indeed any better
than monkeys. Tetlock conducted a long-term study, asking experts to
make predictions and keeping track of the results. From the review:
> Tetlock got a statistical handle on his task by putting most of
> the forecasting questions into a "three possible futures" form. The
> respondents were asked to rate the probability of three alternative
> outcomes: the persistence of the status quo, more of something (political
> freedom, economic growth), or less of something (repression, recession)...
>
> The results were unimpressive..., the experts performed worse than they
> would have if they had simply assigned an equal probability to all three
> outcomes--if they had given each possible future a thirty-three-per-cent
> chance of occurring. Human beings who spend their lives studying the state
> of the world, in other words, are poorer forecasters than dart-throwing
> monkeys, who would have distributed their picks evenly over the three
> choices.
>
> Tetlock also found that specialists are not significantly more reliable
> than non-specialists in guessing what is going to happen in the region
> they study... "In this age of academic hyperspecialization, there is no
> reason for supposing that contributors to top journals--distinguished
> political scientists, area study specialists, economists, and so on--are
> any better than journalists or attentive readers of the New York Times in
> `reading' emerging situations." And the more famous the forecaster the
> more overblown the forecasts...
>
> People who are not experts in the psychology of expertise are likely
> (I predict) to find Tetlock's results a surprise and a matter
> for concern. For psychologists, though, nothing could be less
> surprising. "Expert Political Judgment" is just one of more than a
> hundred studies that have pitted experts against statistical or actuarial
> formulas, and in almost all of those studies the people either do no
> better than the formulas or do worse...
>
> The experts' trouble in Tetlock's study is exactly the trouble that
> all human beings have: we fall in love with our hunches, and we really,
> really hate to be wrong...
>
> Tetlock's experts were also no different from the rest of us when it
> came to learning from their mistakes. Most people tend to dismiss new
> information that doesn't fit with what they already believe. Tetlock
> found that his experts used a double standard: they were much tougher
> in assessing the validity of information that undercut their theory
> than they were in crediting information that supported it. The same
> deficiency leads liberals to read only The Nation and conservatives to
> read only National Review.
I recommend reading the rest of the review; it has many more good points
to make. The only part I would take exception to is the last sentence!
> But the best lesson of Tetlock's book may be the one that he seems
> most reluctant to draw: Think for yourself.
No! Never think for yourself! What do you think all those experts
were doing? They were all thinking for themselves! And look how they
screwed up. That's a sure path to all of the cognitive mistakes and
biases which Tetlock apparently does such a great job of analyzing
and dissecting. It sounds like Tetlock himself does not succumb to
the think-for-yourself mantra, knowing what a mistake that advice is,
but the reviewer didn't understand the point.
Those who have followed my writing will recognize how much I have been
influenced by results like the ones described in this review, thanks
largely to the contributions by Robin Hanson, Eliezer Yudkowsky, and
others who have posted here. IMO studying cognitive biases and errors is
a great way to put your own analytical skills into perspective. (As long
as you don't think while reading; of course this doesn't apply to me!)
Hal