[extropy-chat] Peak Oil meta-news

Eliezer S. Yudkowsky sentience at pobox.com
Wed Mar 8 19:07:06 UTC 2006


Robin Hanson wrote:
> At 08:10 AM 3/8/2006, Lee Corbin wrote:
> 
>>I venture to say that anyone who's studied Gold's fine book
>>will significantly change his odds more towards the abiotic
>>theory, (even if he retains probability at less than
>>fifty percent).
> 
> This can only be true on average for rational people if they do not believe
> your claim.   If I believed you that my odds would go up after reading the
> book, I would just raise my odds in anticipation of that, even if I never read
> the book.   If I am rational, I must expect that reading a book arguing for
> a position on some subject will be as likely to move me away from that
> position as to move me closer.  (More precisely my expected value of
> my future expectation must equal my current expectation.)

Robin, this is true, but I fear the way you phrased it may confuse 
people.  They may imagine picking up Gold's book on Lee's 
recommendation and conclude that the book, which contains many 
arguments *for* abiotic oil, must somehow carry a significant 
probability of leaving them *less* convinced of abiotic oil than they 
were before they ever heard of it.

What actually happens is this:

1)  I assign some small probability to abiotic oil, say, 20%.
2)  I hear Lee recommending Gold's book.
3)  I now assign some probability, say 30%, to Lee's assertion that 
reading Gold's book would leave me assigning a greater probability to 
abiotic oil, say 80%.
3a) This involves assigning a probability to a probability, which 
raises thorny issues of reflectivity that I'm still trying to work 
through, such as empathizing with your future self and granting 
credence to a purely abstract computation.
3b) If you permit the notion of a probability of a probability, it is 
clear that the variance of my distribution over future probability 
assignments has increased.
4)  The actual distribution over my future probability assignment now 
has two spikes: one at 20% and one at 80%.  My *expectation* lies 
somewhere between these two points (with the numbers above, 
0.3*80% + 0.7*20% = 38%).
5)  I read Gold's book and find it has no convincing favorable 
arguments.  My opinion is now unchanged from what it was at the start 
of the analysis, 20%.  This was always my dominant opinion, and has 
not changed; however my *net expectation* briefly rose and then 
settled back down.  (See the numeric sketch just below.)
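
Here is a minimal numeric sketch of steps 1-5, in Python, using only 
the illustrative figures above (20%, 30%, 80%); nothing in it is an 
estimate of anything real:

# Toy numbers from the steps above; purely illustrative.
p_start     = 0.20     # step 1: my probability of abiotic oil
p_lee_right = 0.30     # step 3: probability that Lee's claim is correct
p_if_right  = 0.80     # step 3: where Lee says the book would leave me
p_if_wrong  = p_start  # otherwise I expect to end up where I started

# Step 4: two spikes in my distribution over future assignments.
spikes = {p_if_right: p_lee_right, p_if_wrong: 1.0 - p_lee_right}
expectation = sum(p * w for p, w in spikes.items())
print(expectation)   # 0.3*0.8 + 0.7*0.2 = 0.38, above the initial 0.20

# Step 5: I read the book, find no convincing arguments, and land on
# the 0.20 spike; my dominant opinion never moved, but my expectation
# briefly rose to 0.38 and then settled back down.
print(p_if_wrong)    # 0.20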

Therefore:

6)  There is still net information value to reading Gold's book, even 
after taking into account Lee's recommendation of it.
7)  When it is said that I must assign a balanced expectation to Gold's 
book increasing or decreasing my probability estimate of abiotic oil, 
this balance is estimated relative to my state of uncertainty as to 
whether Gold might have any good arguments, *not* relative to my 
pre-analysis state of having never heard of Gold.
7a) Gold's book, taken as an isolated artifact, does not necessarily 
have a balanced expected effect on a rational reader's opinions for or 
against abiotic oil.
7b) For example, suppose that I hand Gold's book to someone after 
stripping off the cover, so the reader has no idea what the book is 
about.  It is quite reasonable for me to expect, on net, that Gold's 
book will shift their opinion toward abiotic oil rather than away from 
it.  (The sketch after 7c runs this in the same toy numbers.)
7c) This does not reflect unfavorably on the rationality of the 
reader.  Aumann's Agreement Theorem assumes common knowledge.  I know 
what the book is about, but the person who is about to read it does 
not.
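
And point 7 in the same toy numbers: the expected shift from reading 
the book is balanced (zero) relative to my state *after* hearing Lee's 
recommendation, but not at all balanced relative to the pre-analysis 
20%.  Again, purely illustrative figures:

p_lee_right = 0.30
posteriors  = {0.80: p_lee_right, 0.20: 1.0 - p_lee_right}

# My expectation after hearing Lee, before reading: 0.38.
expectation_now = sum(p * w for p, w in posteriors.items())

# Expected shift relative to that informed state: balanced at zero.
shift_vs_informed = sum((p - expectation_now) * w
                        for p, w in posteriors.items())

# Expected shift relative to the pre-analysis 20%: +0.18, unbalanced.
shift_vs_pre_analysis = sum((p - 0.20) * w for p, w in posteriors.items())

print(round(shift_vs_informed, 10))      # 0.0
print(round(shift_vs_pre_analysis, 10))  # 0.18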

Good rationalists are likely to keep separate buckets for evidence that 
is strictly from observed facts, and evidence that is from others' 
observed opinions.  I think this is a wise policy in practice, though in 
theory other people are just another kind of observable fact.  Gold's 
book does not necessarily have a balanced expected impact on your 
fact-bucket - only on your combined bucket that includes your weighting 
over all observed facts *plus* all observed opinions.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


