[ExI] Towards a new transhumanist movement.
Darren Greer
darren.greer3 at gmail.com
Thu Oct 14 09:44:56 UTC 2010
John Grigg wrote:
"I think much of the love for uploading comes from not only the desire
to live forever, but also to be a master of one's own personal
universe in a Star Trek holodeck-like setting. I suspect many people
yearn for the "best possible scenario" where a seedAI goes full-bore
Singularity and lovingly goes around uploading most or all of humanity
so that we can all be saved from disease, aging and death, and have a
great time from then on."
Perhaps not a bad state of affairs, but having read about it and followed the
discussions on this list as closely as time allows, I am slowly coming to a
similar conclusion. The way you've put it, John, reminds me of another cult of
personal transformation I've rejected since about the age of twelve: the idea
of heaven. On one hand it seems too good to be true, and on the other, too
boring to be any fun.
I also hate Star Trek, though I loved it as a little kid. My Dad pointed out
to me once that it was a sign of the times that the enemies in the old Star
Trek were purely biological and differed in terms of race (he even suggested
the Klingons were suspiciously, if exaggeratedly, Slavic in feature, which
reflected the Cold War enemy on Earth at the time), and that the "new" enemy
in the Patrick Stewart version were cyborgs.
Alan Grimes wrote:
"Once the features of the dogma of New Transhumanism are decided upon, me
or someone else will codify them. These texts must then be treated as
absolute dogma. Any disagreement with any core tenet must be treated with
exactly the same derision that someone opposing uploading now faces. New
Transhumanism must then mirror everything the uploaders do. If the
uploaders make a movie, then we must make a movie. If the uploaders
write a blog post, then we publish a blog post. If they say that
everyone will be an upload in 30 years, then we say that everyone will
be X in 30 years."
Absolute dogma? Meet fundamentalism with more fundamentalism? One way you
could start, I suppose, would be to reject out of hand anyone on this list
whose views differed from the accepted party line. You could close membership
in other ways to stifle debate as well. I suspect you'd spend a lot of time
bickering over specific policy and membership requirements, and looking for
new ways to win various pissing matches with perceived opponents, at the
expense of genuine scientific speculation and the spirit of discovery and
serendipity that often emerges from heated debate and philosophical
disagreement. For your model you could take Scientology, or, if you wanted
something less radical and more subtle, the methods of the global warming
community discussed on here recently. I read those e-mails from the
scientists who did not want their methods or basic tenets questioned.
As weird as it sounds, and largely owing to my admitted ignorance of the
science, which I am trying to rectify by going back to school for a scientific
education (which is turning out to be a real blast, by the way, and easier so
far than I thought it would be), I have few opinions on what the future will
be like and whether we'll survive it in our current form, in a different one,
or at all. I came across this quote recently from George Steiner, which I'm
sure some of you are familiar with, but it was new to me. It is intended as a
warning, I suppose, but for me it was a decent summation of my own need to
just be around for the show.
"We shall, I expect, open the last door in the castle even if it leads,
perhaps because it
leads, onto realities which are beyond the reach of human comprehension and
control. We shall do so with that desolate clairvoyance, so marvelously
rendered in Bartók’s music, because opening doors is the tragic merit of
our identity."
Cheers,
Darren
On Thu, Oct 14, 2010 at 4:42 AM, John Grigg <possiblepaths2050 at gmail.com> wrote:
> Alan Grimes wrote:
> What I've learned is that trying to argue with uploaders is hopeless.
> You can't get them on philosophy, you can't get them on technicalities,
> you can't get them on logistics, and you can't appeal to a stronger
> desire. They literally live for the sake of sticking their brain in a
> meat grinder.
>
> My problem with that is that they've taken over the whole of
> transhumanism, and if the chairmanship of the "Future of Humanity
> Institute" at Oxford University means anything, then they've
> commandeered that too.
>
> If broader transhumanism is to have any chance at gaining a foothold,
> the deathgrip of the uploaders must be loosened.
> >>>
>
>
> This is an ancient argument, but I am fed up with those who feel their
> upload is *them.* At least with conventional concepts of uploading,
> the upload is actually a perfect copy, but not them in a "self-circuit"
> continuity way! lol If I get blown up in an aircraft accident, my
> back-up copy is my perfect clone in mind and body, but not *me.*
>
>
> I love the story where two men are arguing about consciousness and
> personal identity and they live in a time when the technologies we
> talk about actually exist. Thomas is convinced that an upload or
> biological back-up is him in every way that matters, arguments about
> continuity be damned. But Rick thinks this is total stupidity and the
> uploads and back-ups are merely copies, no matter how perfect they
> might be. As they argue on and on, Thomas finally states that he is
> so sure
> of himself that he could die and it would not matter since he has a
> back-up on file. Rick takes a big gun out of a drawer, points it at
> Thomas's chest and asks him again if he *really* feels that way...
>
>
> Now I realize that if some godlike post-singularity technology can
> "Tron-style" scan me into it's system, and then later biologically
> restore me, well, I suppose that's different. But I hope not to be
> biologically killed to become an upload, and then killed as an upload
> to become biological again (with the illusion of continuity).
>
>
> I think much of the love for uploading comes from not only the desire
> to live forever, but also to be a master of one's own personal
> universe in a Star Trek holodeck-like setting. I suspect many people
> yearn for the "best possible scenario" where a seedAI goes full-bore
> Singularity and lovingly goes around uploading most or all of humanity
> so that we can all be saved from disease, aging and death, and have a
> great time from then on. But will it actually happen that way? lol
>
>
> As for the transhumanist movement being "taken over" by uploaders, I
> think you are really overstating things. But I do think there are
> those of us who are comfortable with uploading and "fast and loose"
> concepts of personal identity & continuity, and those of us (like me)
> who are not.
>
>
> John
--
"I don't regret the kingdoms. What sense in borders and nations and
patriotism? But I miss the kings."
-*Harold and Maude*