[ExI] Are we too stupid? (Was: Cold fusion)

Stefano Vaj stefano.vaj at gmail.com
Sun Jun 2 13:20:36 UTC 2013


On 27 May 2013 18:45, Anders Sandberg <anders at aleph.se> wrote:

> This is actually a really good question. How smart do you have to be in
> order to trigger a singularity?
>

Let us give it a more Nietzschean turn: how much WILL do you have to summon
up to trigger a singularity?

My own take on historical singularities is closer than average to the
physical origin of the concept: a singularity is simply the threshold where
our predictive tools and theories collapse, not some mystical place, time,
or event where we are actually faced with infinite quantities,
probabilities higher than one or lower than zero, or other mathematical
artifacts that merely reveal the inadequacy of our (current) equations.

Accordingly, it is only natural that singularities behave like horizons,
constantly receding before those who approach them.

OTOH, a singularity, a Zeit-Umbruch ("epochal rupture"), can well be
identified in catastrophic changes of state, in paradigm shifts, in radical
breakthroughs that change the very nature and sense of our "being in the
world". Now, I have been persuaded since the eighties that we do stand on
the verge of one such metamorphosis.

Only, nothing prevents us from staying there forever (that is, until a
merciful extinction terminates an existence that has thus become devoid of
meaning), and my concern is that powers and worldviews willingly and
deliberately embracing stagnation, à la Brave New World, remain, or rather
increasingly become, dominant in our societies.

Transhumanism for me is basically the ideology of those who embrace change
and self-overcoming as a fundamental ethical option, and who offer that as
an alternative to the humanist bias.

--
Stefano Vaj