[extropy-chat] Futures Past

J. Andrew Rogers andrew at ceruleansystems.com
Mon Oct 10 04:59:06 UTC 2005


On 10/9/05 8:25 PM, "Russell Wallace" <russell.wallace at gmail.com> wrote:
> I'll bite: My estimate is that the Singularity will happen by the end of this
> century, unless we fall into an existential disaster such as nanowar or world
> government instead; so anything you want to do to influence the outcome, you
> probably need to get it done by 2100 or not at all.


The very notion of the Singularity as most transhumanists conceive it is, as
best I can tell, premised on the same poor reasoning that Harvey is
pointing out.

The Singularity as most people conceive of it essentially requires a hard
take-off(*) of some type in an environment that is, in many respects, not
qualitatively different from what we have now.  It is the old fallacy of
looking at a technology in isolation and forgetting that every other
technology in society will co-evolve with it.  At what point does the
Singularity occur from the perspective of individuals if they are an
integral part of the system?  In many scenarios it would be as fluid a
transition as the 20th century was.  Sure, to a 2005 human it would appear
abrupt, but at any point on the curve you will have something of human
derivation that has fluidly evolved with the technology and is integrated
with it.  Hell, many of us probably cannot imagine a world without the
modern Internet, but I would assume that most of us have in fact lived in a
world without it at some point.

I can easily envision a scenario that a 2005 transhuman would call "a
Singularity" but that a 2025 transhuman would call "technological
progress".  In 2025, though, you probably won't have a 2005 transhuman to
ask.


(*) One could define "hard take-off" as a scenario in which widespread
technological co-evolution does not occur.  In practice such co-evolution
has always occurred, but there is a popular argument that AGI is
qualitatively different as a technology vector.


J. Andrew Rogers




