[extropy-chat] Re: Overconfidence and meta-rationality

Hal Finney hal at finney.org
Mon Mar 21 20:40:24 UTC 2005


Russell Wallace writes:
> On Mon, 21 Mar 2005 10:44:37 -0800 (PST), "Hal Finney" <hal at finney.org> wrote:
> > You're both
> > born into the same world, products of the same evolutionary process;
> > you're exposed to different information, and ultimately your estimations
> > of probabilities are based on these causal factors.  Do you accept that
> > if you had been exposed to the experiences Eliezer had, you would have
> > come up with his estimation of the probabilities, rather than yours?
>
> No. Our beliefs on things that aren't matters of proven fact depend
> not only on the domain-specific information we've received, but also
> on our genes and general experiences that form our personalities.

What about the thought experiment I offered to Eliezer: suppose you had
a duplicate made in a magical duplicating machine, and then the two
of you went off and studied some question for a year, independently.
Perhaps it's the question you and Eliezer disagreed about.  When you
return, you each simultaneously announce your estimate of the
probability of the matter in question.

To make it dramatic, suppose that one of you decides that the probability
is low, say 1/10, while the other copy decides that it is high, 9/10.
If you put yourself in that situation, would the mere knowledge that
your copy reached such a different result cause you to drastically
revise your own estimate?  After all, you might just as easily have
been him, and had you been exposed to the same information he was, you
would presumably have come up with essentially the same result.  In a
symmetric situation like this, aren't you forced to view both
probability estimates on equal grounds, rather than favoring the one
that you happen to hold?
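To make the symmetry concrete, here is a minimal sketch in Python (the
pooling rule, averaging the two estimates in log-odds space, is my own
illustrative assumption, not something the scenario dictates).  Treating
the two announcements on equal grounds, the mirror-image estimates 1/10
and 9/10 cancel exactly and the pooled estimate lands at 1/2:

    import math

    def log_odds(p):
        # probability -> log-odds
        return math.log(p / (1 - p))

    def from_log_odds(x):
        # log-odds -> probability
        return 1 / (1 + math.exp(-x))

    # The two copies' announced estimates from the thought experiment.
    p_you, p_copy = 1 / 10, 9 / 10

    # Treat the estimates symmetrically: average them in log-odds space.
    pooled = from_log_odds((log_odds(p_you) + log_odds(p_copy)) / 2)
    print(pooled)   # 0.5 (up to floating-point rounding)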

Let's further impose the rather artificial restriction that all you can
do now is exchange your (updated) probability estimates with each other.
As you listen to the other copy's estimates, your own probability
estimates may change, and you might expect his to change as well.

What do you think about the claim that in this situation, if you both
view each other as honest, rational truth-seekers, it is impossible for
the two of you to "agree to disagree"?  That you will both eventually
converge on a common estimate of the probability, even if you are not
allowed to explain the reasoning and information that led you to your
opinions?  Mere knowledge of the disagreement is enough to eliminate it,
according to Aumann's agreement theorem.
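If you want to watch the convergence happen mechanically, here is a toy
simulation of the dynamics behind the theorem, in the spirit of
Geanakoplos and Polemarchakis's "We Can't Disagree Forever".  The state
space, event, and partitions below are made-up illustrations of mine,
not anything from our exchange: the two copies share a prior, each
privately learns which cell of its own partition contains the true
state, and they take turns announcing P(A) given everything that has
become public so far.

    from fractions import Fraction

    # Toy example (purely illustrative): four equally likely states.
    states = {0, 1, 2, 3}
    prior = {w: Fraction(1, 4) for w in states}   # the shared prior
    A = {0}                                       # the disputed event

    # Each copy privately learns only which cell of its own
    # partition contains the true state.
    partitions = [
        [{0, 1}, {2, 3}],   # what copy 1 can distinguish
        [{0, 2}, {1, 3}],   # what copy 2 can distinguish
    ]

    def cell(partition, w):
        return next(c for c in partition if w in c)

    def posterior(info):
        # P(A | info) under the shared prior
        return (sum(prior[w] for w in info & A)
                / sum(prior[w] for w in info))

    true_state = 1
    public = set(states)   # states not yet ruled out publicly

    for _ in range(3):
        for i, part in enumerate(partitions):
            # What this copy would announce in each still-possible state:
            table = {w: posterior(cell(part, w) & public) for w in public}
            said = table[true_state]
            print("copy %d announces P(A) = %s" % (i + 1, said))
            # The announcement rules out every state in which this
            # copy would have said something different.
            public = {w for w in public if table[w] == said}

Running it, copy 1 opens with 1/2, copy 2 answers with 0, and copy 1
immediately revises to 0: the disagreement disappears even though
neither copy ever explains its reasoning.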

By focusing on this rather artificial example, I am trying to eliminate
the variable you cited, the variation in genetics and prior experiences.
In this case, there will be variation due to experience over the course
of the experiment, but you will clearly start with the same set of biases
and beliefs ("priors", in Bayesian terms).  I wonder if you would then
find the no-disagreement argument persuasive.

Hal


