[ExI] Singularity
Stefano Vaj
stefano.vaj at gmail.com
Mon Nov 15 00:22:21 UTC 2010
On 14 November 2010 18:50, Richard Loosemore <rpwl at lightlink.com> wrote:
> We cannot predict the future NOW,
> never mind at some point in the future. And there are also arguments that
> would make the intelligence explosion occur in such a way that the future
> became much *more* predictable than it is now!
Let us take physical singularities. We sometimes have equations good
enough to describe the evolution of a given system, but only up to a
certain point. There are limits beyond which the equations break
down, returning infinities, probabilities below 0 or above 1, or
other results that make no physical sense. This does not imply any
metaphysical consequences for such states; it simply marks the point
where the predictive and descriptive value of our equations stops.
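As a concrete toy example (my own illustration, not from the original
post): the ODE dx/dt = x^2 with x(0) = x0 > 0 has the exact solution
x(t) = x0 / (1 - x0*t), which diverges in finite time at t = 1/x0. A
minimal Python sketch of the same breakdown follows, with an
arbitrary numerical threshold standing in for "the model has stopped
describing anything":

def euler_blowup(x0=1.0, dt=1e-4, t_max=1.2):
    """Integrate dx/dt = x**2 with forward Euler; return the time at
    which the numerical solution diverges, or None if it never does."""
    x, t = x0, 0.0
    while t < t_max:
        x += dt * x * x
        t += dt
        if x > 1e12:  # past this point the values mean nothing
            return t  # close to the exact blow-up time 1/x0
    return None

print(euler_blowup())  # ~1.0 for x0 = 1: prediction stops at t = 1/x0

Nothing metaphysical happens at t = 1/x0; the equation simply stops
having predictive value there.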
I do not believe we need to resort to any more mystical meaning than
this when discussing historical "singularities". In fact, I am
inclined to describe past events such as hominisation or the
Neolithic revolution in exactly these terms.
Moreover, historical developments are not to be taken for granted.
Stagnation, regression, or even *real* extinction (of the kind
leaving no successors behind...) are equally plausible scenarios for
our societies in the foreseeable future, no matter what is "bound to
happen" sooner or later in one galaxy or another, given enough time.
Especially if... transhumanists are primarily concerned with how to
cope with some inevitable parousia rather than with fighting
neo-Luddism, prohibitions, and technological, educational and
cultural decadence.
--
Stefano Vaj