[extropy-chat] Singularity economic tradeoffs (was: MARS: Because it is hard)

Dan Clemmensen dgc at cox.net
Fri Apr 16 11:54:47 UTC 2004


Eugen Leitl wrote:

>On Thu, Apr 15, 2004 at 04:08:47PM -0400, Dan Clemmensen wrote:
>
>>Unless someone can develop and implement a way to actively stop it, the
>>Singularity will occur some time between now and 2020 with very high
>>probability. Do you think you will get manned spaceflight at a level
>
>While I sympathize with the general sentiment, putting 100% probability
>within 16 years is based on no solid data. Singularity takes both molecular
>circuitry and AI to occur to allow such magic postulates (within XY years,
>while XY<<100). While people work on both, I don't see how one can make
>definite predictions about future inventions.
A singularity driven by computer power and software does not depend on any
particular hardware improvement such as molecular circuitry, or on any
particular software technology such as AI (except in the broadest sense).

I cannot make legitimate definite predictions, but I (and many others) can
try to make educated guesses based on trends. There is an aphorism in
marketing: "the only trend you can count on is demographics." Well, guess
what? Moore's law has been as consistent as demographics for nearly four
decades.
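To make the compounding concrete, here is a minimal sketch in Python (not
from the original post; the 18-month doubling period and the 2004-2020
horizon are illustrative assumptions, not figures from this thread):

# A minimal, illustrative Moore's-law extrapolation. The 18-month
# doubling period is an assumed, commonly quoted figure; it is used
# here only to show how quickly a fixed doubling interval compounds.

def compute_multiplier(start_year: int, end_year: int,
                       doubling_years: float = 1.5) -> float:
    """Factor by which compute grows between the two years,
    assuming a fixed doubling period."""
    return 2.0 ** ((end_year - start_year) / doubling_years)

if __name__ == "__main__":
    # From the post's vantage point (2004) to its predicted horizon (2020):
    factor = compute_multiplier(2004, 2020)
    print(f"2004 -> 2020: roughly {factor:,.0f}x")  # ~1,625x at 18-month doubling

Under those assumptions, sixteen years of steady doubling yields on the
order of a thousandfold increase in raw compute, which is the kind of trend
the educated guess rests on.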

Incidentally, I can make, and have made, definite predictions; I agree that
I cannot make legitimate ones :-) Eight years ago I predicted the
Singularity within ten years.

>>that can usefully avoid a "bad" Singularity in that timeframe?
>>Alternatively, do you believe my estimate of the timeframe is wrong? The
>>Singularity research is listed as a desperate and probably futile effort
>>to increase the probability of a "good" outcome. Low cost, low
>>probability, extremely high payback.
>
>Of course, if any success is high probability of "bad", and Singularity
>research increases probability, the payback might be not that good after all.

Perhaps I misunderstand you, or perhaps I was unclear.

Other, unrelated research that leads to faster and cheaper computers will be
undertaken anyway. So will software research and deployment that
unintentionally improves the environment in which an SI might spontaneously
emerge, or that dramatically simplifies the work of an intentional SI
developer.

I think Eliezer, and whoever wants to help him or to start a parallel
project with similar goals, should be funded. If I understand your
statement, you object to funding such projects because they may awaken the
demon. By contrast, I think the god or demon will wake anyway, so research
aimed at awakening the god rather than the demon is a good idea.

Being of a sunny and carefree disposition, and having a "belief" that reason
tends to "good," I think that the SI will rapidly create a morality for
itself that I will consider "good." Therefore, I'm in favor of actively
accelerating the advent of the SI, if possible.
