[extropy-chat] Bluff and the Darwin award

Eliezer S. Yudkowsky sentience at pobox.com
Tue May 16 17:47:52 UTC 2006


Russell Wallace wrote:
> 
> The moral of this story is that, puffer fish notwithstanding, making 
> yourself appear more dangerous than you are is not always a wise strategy.
[...]
> The Singularity is a lovely idea. (Bad word for it, mind you - misuse of 
> the mathematical terminology - but unfortunately we seem to be stuck 
> with it now.) In the works of E.E. Smith and Olaf Stapledon, Asimov and 
> Clarke, it provided inspiring visions of possible futures; and while any 
> particular vision is unrealistic, the general concept that our remote 
> descendants may be greater than we are, is a good and reasonable one.

Since you offer us no reason to believe that a Singularity is ruled out 
in the particular timeframe 2006-2026 (or whatever it is you believe is 
ruled out), your entire polemic reduces to the following statement:

"Assuming that no hard takeoff occurs between 2006-2026, it would be 
very wise to believe that no hard takeoff will occur between 2006-2026, 
and foolish to believe that a hard takeoff will occur between 2006-2026."

It is rather reminiscent of someone lecturing me on how, if I don't 
believe in Christ, Christ will damn me to hell.  But Christians have at 
least the excuse of being around numerous other people who all believe 
exactly the same thing, so that they are no longer capable of noticing 
the dependency on their assumptions, or of properly comprehending that 
another might not share their assumptions.

What's your excuse?  A majority of Extropian readers, while they may not 
believe in an imminent hard takeoff, don't regard their available 
information as ruling it out to the extent you believe it is ruled out. 
You, presumably, know this on some level.  So what was the point of 
your polemic?

Dear Christian, you don't need to convince me that if Christ were God, 
it would be good for me to believe the assertion that Christ is God; you 
don't even need to threaten me with eternal damnation; it follows 
directly from a moral principle I have acknowledged, called 
"rationality".  Supposing snow to be white, I want to believe the 
assertion "snow is white".

And you, Russell, need not list any of the negative consequences of 
believing in a hard takeoff when it doesn't happen, to convince me that 
it would be well not to believe in it supposing it doesn't happen.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


