[extropy-chat] silent night
Brett Paatsch
bpaatsch at bigpond.net.au
Thu Dec 23 04:45:54 UTC 2004
Eliezer Yudkowsky wrote:
> Hal Finney wrote:
>> How the heck can you guys say that there is as much as one chance in
>> ten thousand that the sun won't rise tomorrow? The sun has after all
>> risen for much more than 10,000 days. That's like 30 years' worth.
> That was before people started playing around with AI. 99.99% would
> correspond to a 50% chance of a rogue AI disassembling the Sun in the next
> 20 years, with the probability distributed evenly over time (Poisson
> process).
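Eliezer's arithmetic can be sanity-checked with a short calculation. The sketch below assumes a 20-year horizon of roughly 365.25 × 20 ≈ 7305 days and a constant hazard rate (the Poisson-process assumption in his post); the variable names are mine, purely for illustration.

```python
import math

# Assumption: ~7305 days in 20 years, constant daily hazard rate.
days = 365.25 * 20           # 20-year horizon in days
survival_prob = 0.5          # 50% chance the Sun survives the 20 years

# Constant hazard rate lam satisfying exp(-lam * days) = 0.5
lam = math.log(2) / days     # per-day hazard rate, ~9.49e-5

# Probability the Sun "rises" (survives) on any one day
p_sunrise = math.exp(-lam)   # ~0.99990, i.e. about 99.99%
print(f"daily hazard rate:          {lam:.3e}")
print(f"daily sunrise probability:  {p_sunrise:.5f}")
```

So a 50% chance of losing the Sun spread evenly over 20 years does work out to a daily sunrise probability of about 99.99%, as claimed.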
This is an excellent example of the sort of conviction I'd like to bet
against using real money. If Eliezer's claim and line of reasoning are
rational, and he had money to bet with me (say US$1000), I'd proceed to
formalising a bet with him that would either allow him to take my money
to do with as he sees fit, or me to take his to do with as I see fit.
Say Eliezer was right: not only would I pay him, I'd be grateful for the
education he'd given me, and much more aware of the nature of the
existential risks I face.
Say I was right: then I might save a good, bright guy like Eliezer from
wasting a huge chunk of his life. He'd be out US$1000, but he's a young
guy and a genuine truth-seeker; he could find plenty of other things to
do with even only another four score years and ten.
But where I stand in relation to Eliezer at present, from my point of view,
is that he is making claims that are extraordinary and for me to properly
evaluate them I'd have to spend a lot of time.
So we have an impasse. In practice I assume that Eliezer is wrong,
because it's too much of an investment of time for me to find out if
he's right.
Eliezer can cope with my disbelief easily. Others believe him. But he
can't teach me and take some of my money, because I have the same
aversion to wasting time as he does.
A for-real money futures market would change that dynamic.
Brett Paatsch.