[extropy-chat] Singularitarian verses singularity

Robert Bradbury robert.bradbury at gmail.com
Wed Dec 21 20:44:52 UTC 2005


On 12/21/05, Anna Tylor <femmechakra at hotmail.com> wrote:

>
> How can you be activist for something that apparently hasn't happened yet?


By trying to pick what you want to happen (from the possible things that
can happen) and then working toward the specific outcomes you find desirable.

> How come it sounds like Nostradamus? Predicting things in advance that
> haven't happened yet?


There is a difference between predicting the future and creating it.  There
is a famous quote by the well-known management consultant Peter F. Drucker:
"The best way to predict the future is to create it."

For example, one of my job search agents today returned a result from
Deloitte (a large business consulting company) seeking a
"Senior Consultant: Vulnerability Assessment and Attack and Penetration".
Such a position is based on determining possible future situations and
designing systems, procedures, etc. which either avoid them completely or
mitigate their negative impact.  One can even view the recent activity on
"preparation" for dealing with Avian (H5N1) flu as falling into this
category.  It involves risk and/or hazard function assessment and dealing
with those hazards before they happen.

> How can I be singularitarian when I'm not sure what is singularity?


You can (roughly) be a singularitarian if you believe things are going to
evolve and complexity will increase at an increasing rate.  (Kurzweil's
writings are good sources for this.)  What "the" singularity actually is
can be a topic of significant debate.  There are different types of results
that can come out of the singularity process, and one can get into significant
debates as to how the singularity manifests itself (these typically go by
terms like "soft takeoff" and "hard takeoff") -- and believe me, over the
years the ExI list has had these discussions at length.

> Knowing it's going to happen doesn't predict how it's going to happen?

"How" it is going to happen is due to the increase in knowledge, complexity
and intelligence (be they be based on wetware (brains) or dryware
(computational capacity, software, AIs, etc.)).

What that does *not* give you is whether or not the path is optimal or
desirable.  For example, given my current understanding of the goals of The
Singularity Institute and how I believe it plans to achieve them, I tend to
remain opposed to its efforts.

To some extent, this relates to the points made by Andres Vaccari in his
recent review of TSIN -- there is little, if any, discussion of *how* we
manage the process (in society, as individuals, etc.).  This matters
particularly if one takes into account the feelings, desires, moral
throw-weight, etc. of those people who have "cast in concrete"
non-transhumanistic and/or non-extropic viewpoints (I consider the two to be
distinctly different).

Plenty has been written on these topics (one can always start with Wikipedia,
since they try to keep things relatively brief) when wading into these
swamps.  For more information, you have to go back and read things like list
archives or position papers on specific aspects of a topic (or find someone
willing to speak with you directly [at length]).

Robert