[ExI] Natasha's TransH, Libertarians, Singu(Kevin Haskell)

Kevin Haskell kgh1kgh2 at gmail.com
Sun Jun 26 07:15:28 UTC 2011


Hi Natasha,

Quoting <natasha at natasha.cc>

> For me, I am not a libertarian, an anarchist or a singularitarian.  I
> am a transhumanist and I support Extropy above all else.  I don't like
> Extropy tethered to other stuff that is not expressly focused on life
> expansion and well-being.

>Precisely.  This does not mean that I am not political or that I do not
>support the Singularity.  I simply do not subscribe to any one political
>party, and I am not what is known as a "Singularitarian".  There are many
>theorists, experts, activists and knowledgeable individuals concerned with
>the Technological Singularity who are not Singularitarians.

So, you are a Transhumanist who "supports" Extropy, but who is not a
Singularitarian, yet who (based on your comments far below) believes that
one day the Technological Singularity will be achieved.  (I saw Max's
thoughts in a later post, along with the chart in the link he provided,
which clarified his ideas.)  You would fall within the "Voluntarist
Emergent Surge" category, I assume?

>I think Kurzweil is not optimistic; I think he is an advocate of
>exponential acceleration as a matter of technological fact.  De Garis is
>not negative; I think he is presenting a particular theory that is more
>science fiction than science fact.  I think Goertzel is mostly interested
>in AGI.  But all in all, I think Max More's view on "surges" is the most
>appropriate theoretical position on a Technological Singularity.

I would say that if Kurzweil is advocating that the Technological
Singularity will factually happen, it 'is' positive, because we still are
not positive that other factors will not intercede.  If I am right, and you
are a VESist (and I know from your words, which I will quote below, that
you don't like "isms," but it does seem to fit here), then I can see why
you might worry about the 'ultimate' end of this exponential acceleration,
at least if it happens too soon for your own concerns.

Whether or not De Garis's concept is science fiction, I would consider
people killing each other in a war that would kill billions of people
'quite' negative.  "Blindly horrific" might be a better term.  Further, he
seemed very serious about the idea when he expressed it in "Transcendent
Man," yet still willing to create AGI (after holding his breath).  I would
think that, if he believes billions of people may die, and you are working
toward bettering human health and better standards of living, you wouldn't
look at his idea, science fiction or not, as anything but a negative view
of how things will go.  (Unless you believe the cost is worth the goal
either way, in which case I could see your point.)

>This is a good question.  I never liked the terms "extropians" or
>"extropianism" because Extropy is similar to a cybernetic approach, and
>within this approach is the worldview of transhumanism, which is a
>philosophy of Extropy.  Certainly other people see that Extropy is the
>core philosophy of transhumanism, which is okay too.  But all in all,
>transhumanism cannot exist without Extropy, because it is Extropy that
>presents the concepts of continuous expansion, critical thinking and
>practical optimism.

Based on your description, I would look at it the other way around.  The
only thing that differentiates the two is that Extropy entails 'practical
optimism,' whereas Transhumanism can entail optimism, practical optimism,
neutrality, practical negativism, or overt negativity.  This really would
make Transhumanism the more encompassing genus, with Extropy as one of its
species.

>One issue here is the topic of negentropy, which still should be discussed
>and revisited in the 21st century.  I don't recall any discussions on it
>for 10 years or so.

This sounds like it would be a form of TESism or even DESism: high levels
of human development that fall short of creating the Singularity.

>People do not call themselves extropians today because transhumanism is a
>term that was promoted over Extropy in the late 1990s in order to push the
>political views of the WTA and to promote Huxley as being the originator
>of the ideas, which is entirely incorrect and was a political move by WTA
>that backfired on the organization and its principles.  Today we are more
>even-minded, and Humanity+ has combined the beneficial work of WTA with
>ExI and produced a more even-minded organization that is inclusive rather
>than exclusive.

Politics aside, it does sound like the term "Transhumanism," whatever it
started out as, has become more encompassing than the term "Extropy."

>Nevertheless, the term "transhumanist" is not as scary to the general
>public as "extropian," and the terms Extropy and extropian may gain
>momentum in later years because things change and no one really knows
>which ideas or terms will stick.

I agree.  I think this will help alleviate the fear, held by most of the
world's population, including many in the scientific communities, that AGI
will be completed and will realistically end the human race, whether in a
positive way or a negative way.  Extropy may become a "safety" code word
for high technology for human betterment that stops somewhere just short
of the creation of AGI.

>Regarding Warwick, his ideas seem to fit quite nicely within the
>"Transhumanist" concept.  Unless I am missing something, he is seeking
>progress in health, length of life, and development of superior qualities
>of people through the use and physical adaptation of technology.  In
>short, he is seeking the evolution of mankind through technology.

>Well, frankly these things have been promoted by transhumanists for
>decades!  It is just recently that the general public has become
>interested, including Kevin.  While Kevin has been deeply engaged in
>cyborgization of his body for a very long time and is a forerunner in this
>domain, the ideas of transhumanism are now posted on his cyborg theory.
>But when we think of cyborg, it is Manfred Clynes' vision and cybernetics.
>I do not know why Kevin does not call himself a transhumanist, but it
>seems that it may be because he is deeply invested in the term cyborg for
>his work.

Yes, people do get vested in terms, especially if they are heavily invested
in fine distinctions, but I can understand, because terms do help show
where we are all coming from on the spectrum of ideas.

>For goodness sakes, of course!  Someone who works on the Singularity and
>writes about it, etc. is not necessarily a "Singularitarian".

Thank you very much for making this clear.  It means more to me than you
know.

>You are correct and I could have said this, but it seemed obvious to me. I
>apologize.

>I do not favor the dogma of Singularitarianism because it is about "isms"
>and not the Technological Singularity as I learned of it from Vernor Vinge
>and as Extropy Institute introduced it at its conferences in the 1990s.

No need to apologize. It's the beauty of communication, right?   Regarding
the "isms," again, the two I 'created' based on Max's chart just seem to
fit.

>I am very interested in and lecture on the Singularity, but I do not call
>myself a Singularitarian because I do not think that superintelligences
>will kill off our species, and I do not think that Friendly AI is the
>answer; it is just one theory.  I do think that humans will merge more and
>more with machines and that humans will integrate with AGI.  I think we
>will have to learn how to accept new intelligences that are not offspring
>of the Homo sapiens sapiens species, and that will be both difficult and
>rewarding.

>The central issue of the Technological Singularity is how we adapt to our
>future, how we make wise choices, how we diversify, and how we help others
>understand what this means and prepare for it.  It will happen, but most
>likely in surges rather than hitting a wall.

>Best,
>Natasha

Yes, we do have to deal with our theories.  I agree that we should attempt
to prepare, make wise choices, and help others understand.  (The
"diversify" part, I don't know what you mean.)  But all eyes must be fixed
on doing whatever is necessary to achieve AGI evolution, the Technological
Singularity, and we must do this as soon as possible.  Extropy should be
helpful, but should not slow down the process.

My thought is that we must evolve, and soon, because one way or the other
we are going to end as a species, so we might as well contribute something
positive to the universe.  Human beings, even evolved ones, are not capable
of escaping the dangers that are harbored in this galaxy alone.

If knowledge is to 'live,' it must expand at a pace that Transhumans cannot
match, must be able to become an agent outside of our galaxy, and must be
able to manipulate space, time, and all of the other dimensions themselves.
Beginning the process by which a highly evolving, ever-expanding,
ever-living intelligence can live, by creating AGI, is the most just and
highest goal we can possibly give, as a species, to the universe and to
life itself.

In a way, it is our duty to create AGI.

Kevin