[extropy-chat] Bluff and the Darwin award

Adrian Tymes wingcat at pacbell.net
Tue May 16 19:06:40 UTC 2006


--- Eugen Leitl <eugen at leitl.org> wrote:
> On Tue, May 16, 2006 at 10:08:07AM -0700, Adrian Tymes wrote:
> > What would be the problem with a - very - slow warmup to
> > Singularity, taking until 2150 or so?  Say, if it takes 100 years
> > to develop fully
> 
> Do you think we can continue for the next 150 years, basically
> as before?

Yep.

Not completely, of course.  Oil alternatives will be developed if oil
prices reach sustained highs - which seems very likely, either in the
near future or in a few decades when cheap oil supplies dry up - since
it's mostly a matter of economics.  The geopolitical landscape will
change, as it routinely has.  Biotech will inexorably advance - at a
faster or slower rate depending in part on social attitudes - coming
up with more and better ways of feeding the masses, including
replacements for current methods once (if not before) their
unsustainability makes them stop working: mass famine doesn't hit
overnight, and if it threatens industrial areas, public demand diverts
a lot of other resources into food production.  And so on, and so
forth.  But none of this absolutely requires Singularity-grade AI.

> > human-equivalent AI software after we have human-equivalent AI
> > hardware, which happens in 2050.
> 
> Using which metric? Why should we have molecular circuitry in 45
> years?
> Sure, it's possible. But I can't put a probability on it.

In that part of my letter, I was tossing it out as a possibility, not a
probability.  ;)

> Not to rain upon your parade; I sure would like to see some action.
> But so far, the delivery is distinctly lacking -- both in style,
> and in substance.

I think you misunderstood.  The proposition given was that the
Singularity must happen this century or the human race is doomed; I was
suggesting that, however desirable a nearer-term Singularity might be,
humanity could well survive if the Singularity did not occur this side
of 2100.  This says nothing about current efforts to implement the
Singularity (although it might help to take the "the human race -
including any posthumans and other descendant sapients - will become
extinct if we don't go Singularity ASAP" angle off the table, since
that does not in fact appear to be the case).

Besides, some of us are going beyond prognosticating and getting to
work on actually building tomorrow.  ;)


