FW: [extropy-chat] Singularitarian verses singularity

Brandon Reinhart transcend at extropica.com
Wed Dec 21 23:42:03 UTC 2005


Argh. My original post never went through, but my reply to myself did.
Reposted:

-----Original Message-----
From: Brandon Reinhart [mailto:transcend at extropica.com] 
Sent: Wednesday, December 21, 2005 2:19 AM
To: 'ExI chat list'
Subject: RE: [extropy-chat] Singularitarian verses singularity

> How can you be activist for something that apparently hasn't happened yet?

You begin by studying the logic, the science, and the theories that form the
basis for a belief in, or desire for, the singularity. If you come to the
conclusion that a singularity is possible (or inevitable) then you almost by
extension become an activist.

Whether you are talking about the birth of a strong AI (Vinge
singularity/hard takeoff) or a slow takeoff due to technological growth
(Kurzweil singularity), you have a scenario with the potential to cure
countless diseases, save countless lives, and improve the welfare of society
greatly. Even low shock level scenarios include extremely long lives, cures
for major human ailments, etc. Why would any human, who has come to the
singularity conclusion >>through the application of reason<<, not extend
that process of reasoning to conclude that it is ideal to work to bring
about a singularity as quickly and as safely as possible?

Even if you do not accept a hard takeoff singularity scenario, it is still
reasonable to conclude that destroying deathist taboos and approaching human
mortality as a disease to be cured (or engineering problem to be solved) is
good for our society. It is still reasonable to conclude that intelligence
enhancement would lead to a better society.

Even if futurist projections of a singularity in this century are way off,
it still makes sense, in principle, to pursue the goal of building a
rational, technology-driven society that seeks to uphold and protect
extropian values.

> How come it sounds like Nostradamus?

Because a cursory examination only reveals the quick payoff. You see
"technological apotheosis" and your brain says "bullshit!" Which is good:
skepticism is the extropian's most valued tool. The ability to filter shit
is indispensable. It drives you to check the facts; to study the theories or
to do the science.

Believe me, when I started reading about the possibility of
ultratechnologies far beyond my shock level, it turned on all sorts of
warning sirens. I do not accept any prediction of a time frame for
singularity.

> How can I be singularitarian when I'm not sure what is singularity?

You can't. You don't want to be. Don't aspire to be an activist until you
know what you are working to achieve. This is a simple rule for any kind of
activism. If you blindly accept everything written in "The Singularity is
Near" you aren't helping the cause. You have to understand why you believe
what you believe.

Build up your knowledge slowly. Read a lot. Think about what you read. Write
about what you think. Digest the concepts and apply your skepticism. If
something doesn't seem to fit in your rational view of the world, figure out
why: is it too grandiose? Too unlikely? Too crackpot? Too illogical? Too
dangerous? Too undesirable? Too desirable? Put the ideals and theories that
comprise extropianism, transhumanism, and singularitarianism through your
intellectual wringer and the ones that come out still standing...well, take
those and build on them.

If you find terms or concepts you don't understand, break them down. Don't
just accept them because some futurist said it would be so. I think if you're
a true rationalist, this process of breaking down ideas will be something
you do all your life. That's part of the "perpetual progress" that defines
the extropian.

When you read a post on this list, don't let it stand as some great
statement of transhumanist wisdom. Think about it. Challenge it. Meme
osmosis is the enemy.

There is no single great conception of what the singularity is or what it
will be like or when it will happen. There is only a community of common
values and goals: progress, self-transformation, optimism, intelligent
technology, open society, self-direction, and rational thinking.

> Knowing it's going to happen doesn't predict how it's going to happen?

We don't know for certain the singularity will happen.

There are plenty of singularitarians who see the singularity as a time of
immense risk to the survival of our biosphere. We may very well destroy
ourselves before we get the chance for a singularity.

Or we may find that there are challenges in the science we didn't expect.
The singularity might not happen in our lifetimes.

It doesn't really matter. The benefits of a successful singularity are
important enough that we should work toward it even if we don't reach it
soon. The extropian ideology can be applied on an
ongoing basis, regardless of your current technology level.

You can utterly dispense with the singularity, call it complete nonsense,
and still be an extropian. Draw your own lines in the sand. Don't "buy into"
the singularity simply because it's the popular thing to do. Do it because
it fits into your view of the world you want to help build.

- Brandon
