[ExI] Singularity

Richard Loosemore rpwl at lightlink.com
Sun Nov 14 17:50:30 UTC 2010


> I still believe that seeing the Singularity as an "event" taking place at a
> given time betrays a basic misunderstanding of the metaphor, only too open to
> the sarcasm of people such as Carrico.
> 
> If we go for the original meaning of "the point in the future where the
> predictive ability of our current forecast models and extrapolations
> obviously collapses", it would seem obvious that the singularity is more in
> the nature of a horizon, moving forward with the perspective of the
> observer, than of a point event.
> 
> The Singularity as an imminent rapture - or
> doom-to-be-avoided-by-listening-to-prophets, as it seems cooler to many to
> present it these days - can on the other hand easily be deconstructed as a
> secularisation of the millenarian myths which have plagued western culture
> since the advent of monotheism.
> 
> As such, it should perhaps concern historians of religion and cultural
> anthropologists more than transhumanists or researchers.
> 
> --
> Stefano Vaj


I hate to disagree, but ... I could not disagree more. :-)

The most widely accepted meaning of "the singularity" is, as I 
understand it, completely bound up with the intelligence explosion that 
is expected to occur when we reach the point at which computer systems 
are able to invent and build new technology at least as fast as we can.

The *point* of the whole singularity idea is that invention is limited, 
at present, by the fact that inventors (i.e. humans) live only for a 
short time, and cannot pass on their expertise to others except by the 
very slow process of teaching up-and-coming humans.

When the ability to invent is fully established in computational systems 
other than humans, we suddenly get the ability to multiply the 
inventive capacity of the planet by an extraordinary factor.

That moment -- that time when the threshold is reached -- is the 
singularity.

The word may be a misnomer, because the curve is actually a ramp 
function, not a point singularity, but that is just an accident of history.

To detach the idea from all that intelligence explosion context, and to 
talk instead about a time at which our ability to predict the future 
breaks down, is vague and (in my opinion) meaningless.  We cannot 
predict the future NOW, never mind at some point in the future.  And 
there are also arguments that the intelligence explosion could occur in 
such a way that the future becomes much *more* predictable than it is now!



Richard Loosemore




More information about the extropy-chat mailing list