[extropy-chat] Futures Past

Russell Wallace russell.wallace at gmail.com
Mon Oct 10 05:14:26 UTC 2005


On 10/10/05, J. Andrew Rogers <andrew at ceruleansystems.com> wrote:
>
> The very notion of the Singularity as most transhumanists conceive it is,
> as
> best as I can tell, premised on the same poor reasoning that Harvey is
> pointing out.
>
> The Singularity as most people conceive of it essentially requires a hard
> take-off(*) of some type in an environment that is not qualitatively
> different than what we have now in many respects. It is the old fallacy of
> looking at a technology in isolation and forgetting that every other
> technology in society will co-evolve with it. At what point does the
> Singularity occur from the perspective of individuals if they are an
> integral part of the system? In many scenarios it would be as fluid a
> transition as the 20th century was. Sure, to a 2005 human it would appear
> abrupt, but at any point in the curve you will have something of human
> derivation that has fluidly evolved with the technology and is integrated
> with it. Hell, many of us probably cannot imagine a world without the
> modern Internet but I would assume that most of us have lived with that
> reality at some point.
>
> I can easily envision a scenario that a 2005 transhuman would call "a
> Singularity" that a 2025 transhuman would call "technological progress".
> But in 2025, you probably won't have a 2005 transhuman to ask.


Actually I'm inclined to agree with all this; I'm not a believer in the
"superintelligent AI pops out of someone's basement into a world that
otherwise looks mostly like today" scenario, and I was never a great fan of
Singularity definitions based on unpredictability (by that standard we've
had zillions of them already) or incomprehensibility (I don't believe
anything in this universe is fundamentally incomprehensible to a
sufficiently educated human mind).

From a navigation perspective, I think what matters is the event horizon:
the point at which we can no longer change the outcome, for good or bad; the
board has been played into either a winning or losing position. That is the
point which I estimate will come before 2100. (I won't be surprised if it
comes as early as 2050; I will be surprised if it comes much before that.)

- Russell

