[extropy-chat] Singularity estimates vs. research funding

Adrian Tymes wingcat at pacbell.net
Mon Apr 26 18:21:51 UTC 2004


--- Dan Clemmensen <dgc at cox.net> wrote:
> Adrian Tymes wrote:
> >--- Dan Clemmensen <dgc at cox.net> wrote:
> >>Projects to kill:
> >>    manned spaceflight
> >>
> >>Projects to fund:
> >>     Singularity research
> >
> >You know, of course, some of us are looking to
> >manned spaceflight in case the Singularity turns
> >out undesirably (to us as we are at the time).
> >Similar arguments can be made about the other
> >projects on your kill list (with the possible
> >exception of fusion, at least anything involving
> >fusion reactors larger than a standard workbench).
> >
> Unless someone can develop and implement a way to
> actively stop it, the Singularity will occur some
> time between now and 2020 with very high
> probability. Do you think you will get manned
> spaceflight at a level that can usefully avoid a
> "bad" Singularity in that timeframe?

No, but I think it might start to have enough of an
impact by 2050.

> Alternatively, do you believe my estimate of the
> timeframe is wrong?

Yes.  Given the state of the art today, I place a very
low probability on software that we, today, would call
"sentient" being in operation by 2020 - to say nothing
of that software specifically being put to the task of
optimizing itself*, an event which seems to either be,
or be a prerequisite for, the Singularity under most
definitions.  By 2050 the odds are higher, of course.

* This requires having the software, and having its
controller be foresighted enough to allow or direct it
to do so.  That could happen practically
instantaneously, of course, but it could also take
years or decades.  Not all sentient entities always
make optimal decisions.  Proper planning treats the
case in which they do as a corner case, of course, but
it also accounts for what happens in other situations.
Sometimes a solution that works for all the extreme
cases fails miserably if one assumes more average
outcomes...and in practice, one faces the average and
mundane more often than one faces the extreme.

> The Singularity research is listed as a desperate
> and probably futile effort to increase the
> probability of a "good" outcome. Low cost, low
> probability, extremely high payback.

And it may have some other, more immediate paybacks
too.  But I believe the cost is low enough that one
need not cut everything you listed in order to fund it
properly.


