[ExI] Uncertainty (Was: Organizations to "Speed Up" Creation of AGI?)

Anders Sandberg anders at aleph.se
Sun Dec 25 22:16:42 UTC 2011


Kevin G Haskell wrote:
> >"I give reasons for
> >why I think we might be in the range 0.1-1% risk of global disaster per
> >year. I urge a great deal of caution and intellectual humility."
>
> No offense intended by this, but isn't the idea of putting any sort of 
> percentage range on something like this actually mutually exclusive 
> from being intellectually humble? Isn't it a bit like the Doomsday 
> Clock, which has been completely false since its inception decades ago?

What would you bet on the stock market going up 100% on the next trading 
day? Or that military action will break out against Iran over the next 
year? Or that you will unexpectedly meet an old school friend next week? 
These possibilities are real, and it makes sense to assign a probability 
to them - it is just that it cannot be a neat and certain number like 
when you calculate the chance of getting a royal flush from a random 
poker hand. These probabilities are subjective states based on our 
varying ignorance of the world.

Yet merely saying that one is very unlikely and another one is likely is 
not enough for real decision making - when deciding how much to invest 
or where to deploy troops you actually use a plausibility measure. In 
order to make consistent decisions and avoid being Dutch-booked you need 
sensible plausibility update rules, and that leads to the Bayesian 
formulas (see Jaynes, Probability Theory: the Logic of Science, chapter 
2 for a treatment of the Cox theorems, 
http://bayes.wustl.edu/etj/prob/book.pdf ). So if you try to be 
rational, you will go around with subjective probability estimates that 
you try to update as well as possible, applying Bayes' rule to all the 
evidence you see, but still weighted by your initial prior probability 
estimate. This estimate might be expressed as a sharp number between 0 
and 1, but it represents uncertainty, not certainty. (Of course, it 
would be better to express it as a probability distribution - that 
carries more information.)
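To make the updating concrete, here is a minimal sketch of a single 
Bayes-rule update of a subjective probability. All the numbers are 
illustrative assumptions for the sake of the example, not the risk 
estimates discussed in this thread:

```python
# Minimal sketch of Bayesian updating of a subjective probability.
# All numbers are illustrative assumptions, not real risk estimates.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H|E) from P(H), P(E|H) and P(E|not-H) via Bayes' rule."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

# Start from a prior belief of 0.5% for the hypothesis.
belief = 0.005

# Update on two pieces of evidence, each twice as likely if the
# hypothesis is true as if it is false (likelihood ratio 2).
for p_e_h, p_e_not_h in [(0.6, 0.3), (0.4, 0.2)]:
    belief = bayes_update(belief, p_e_h, p_e_not_h)

print(round(belief, 4))  # -> 0.0197
```

Note how two pieces of weakly diagnostic evidence move the estimate 
from 0.5% to about 2% - the sharp-looking number is just the current 
state of an ongoing update process.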

So, getting back to my possible non-humble estimates: they are my 
current estimates. They will change as more evidence accrues - for 
example you giving me a good argument for why they are too low or too 
high, what I read, changes in world politics, what have you. So when I 
say "0.1-1%", do not take this as a claim that some true probability in 
the world has to be in this interval, but that this is what I currently 
take to be the most rational estimate.

The Doomsday Clock is a bit different. I don't think it is intended as 
an epistemic signal (telling exactly how bad the situation is), but as 
a rhetorical device showing concern. It contains some evidence, and it 
does correlate to some extent with what most would agree the threat 
levels were, but it also missed big things like the Cuban missile 
crisis: 
http://en.wikipedia.org/wiki/Doomsday_Clock#Time_changes

>
> >(That sentence structure is worthy of Immanuel Kant :-) )
>
> Depending on what you think of Kant's writing, should I consider that 
> a compliment or insult? ;)

See it as evidence of having far more linguistic working memory than 
most readers :-)

-- 
Anders Sandberg,
Future of Humanity Institute
Philosophy Faculty of Oxford University 



