[extropy-chat] Re: SIAI: Donate Today and Tomorrow
Hal Finney
hal at finney.org
Sat Oct 23 17:46:08 UTC 2004
Giu1i0 writes:
> Is there a way we could edit our message, without compromising it of
> course, in such a way as to provide *also* psychological comfort?
I'm not sure whose message you mean here. It could be transhumanism,
extropianism, or singularitarianism; those are listed roughly in order
of increasing specificity. Transhumanism is the broad goal of
improving humanity; extropianism adds a specific philosophical world
improving humanity; extropianism adds a specific philosophical world
view; and singularitarianism is a very specific path to an improved
future that is actually not particularly extropian or transhuman.
> I will risk heresy and confess that I am beginning to think current
> projects to "engineer a transhumanist religion" (see e.g.
> universalimmortalism.org) are actually good ideas.
I remember Max More saying that he had hopes that extropianism could
evolve into a social and cultural movement that could take the place
of religion in people's minds. It would not just be the kind of
online debating society that this list often becomes. Max talked
about rituals and customs that could fill in for religion. In those
somewhat light-hearted days we saw the extropian salute, the extropian
handshake, and Tom O'Morrow's children's story of Solar Cause. You could
imagine extropian hymns and myths.
The movement hasn't pursued this direction much; I think the increasingly
hostile reaction to transhumanism has forced us onto a war footing perhaps
somewhat earlier than expected. Given the looming threats, these earlier
concepts now seem frivolous and trivial. Computer communication also
tends to be rather cold and lends itself to an analytical mode rather
than the kinds of warm social interactions Max envisioned. For that I
think you need more face-to-face contact.
> Now I wish to reply to the inevitable accusations of heresy before
> they are formulated, and elaborate some more.
> Imagine a Tiplerian omega-point scenario. Or if you think Tipler's
> physical assumptions are wrong, imagine some other scenario with the
> omega-point property: at some point in the future, a human
> civilization may develop the capability to acquire detailed,
> high-resolution information from the past (not against causality), and use
> it to retrieve the information content of human minds in their past,
> perhaps including ourselves here and now. It seems plausible that a
> civilization with that kind of technology will also be able to easily
> upload such information to another body or a virtual environment.
> So we can build a worldview that includes a concept of resurrection
> while at the same time staying compatible with our rational scientific
> worldview.
I'm not sure this is going to work to motivate people, especially in
the context of singularitarianism. First you have the big problem
that many people will object to calling this resurrection, raising
all the philosophical issues of the nature of identity that we argue
over fruitlessly. And second, the singularitarians hope to get their
job done relatively soon, as I understand it. They're not aiming at
success in 50 years. They hope to have results in just a few years,
once they get going. So, unlike conventional religion, a singularitarian
religion could not motivate the average person with the promise of their
own resurrection. Third, while the prospect that the singularity will
resurrect people from the past is a plus, we can already envision
enormous practical benefits from a successful outcome; past resurrection
would just be icing on the cake.
I think a more direct formulation of a singularitarian religion is along
the lines Acy James Stapp suggested, where investment is treated like
Pascal's wager. It's like the old joke where they fire up the big,
all-knowing computer and ask, "Is there a God?" It replies, "There is now."
That joke is going to come true if the singularitarian plans work.
As Acy says, if this happens it can't hurt to be one of the guys who
helped bring it about.
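
To make Acy's wager concrete, here is a toy expected-value sketch in
Python. Every number in it is an assumption invented purely for
illustration; the point is the shape of the arithmetic, not the values.

    # Toy Pascal's-wager arithmetic for a donation. All figures are
    # made up for illustration, not estimates of anything real.
    p_success = 0.01          # assumed chance the project succeeds
    payoff_to_helper = 1e6    # assumed utility to a contributor if it does
    donation_cost = 100       # utility given up by donating today

    expected_gain = p_success * payoff_to_helper - donation_cost
    print(expected_gain)      # 9900.0: positive despite the 1% odds

The structure of the wager is that even a small probability times a
large enough payoff swamps a modest cost, so the conclusion is not very
sensitive to the particular numbers.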
Eliezer says he doesn't want to make any promises, but maybe it would
make sense to have a religion which worships the God-to-be, and to make
sure He exists, they are going to build Him. And they can believe that
God will honor the wishes of those who contribute, so long as this is
compatible with His basic goodness and can be done without harming anyone.
Singularitarians already believe that their AI will usher in a world of
ultimate peace and goodness, so I don't see it as that much of a stretch
to call it God.
Hal