[extropy-chat] Human - Posthuman gap
sjatkins at mac.com
Fri Apr 29 06:21:24 UTC 2005
On Apr 28, 2005, at 5:16 PM, Joseph Bloch wrote:
> Samantha Atkins wrote:
>> On Apr 27, 2005, at 5:20 PM, Joseph Bloch wrote:
>>> I don't see this as being what Eugen is saying at all.
>>> Saying that there will be an end to the Human species, and that they
>>> will be succeeded by some PostHuman successor species, is most
>>> definitely not the same as saying that "there may be no place for
>>> the majority of the people around us".
>>> I think even the most hardcore of us would agree that we would (and
>>> actively do) encourage the transition of each and every person alive
>>> today into a state of PostHumanity. That's why we almost universally
>>> agree on the desirability of making sure Transhumanist technologies
>>> are available to as wide an audience as possible. (Disagreement, of
>>> course, arises based on the question of just what is the most
>>> efficient way to make sure that happens; a relatively
>>> government-free Free Market, or a government-driven model; but the
>>> end result is common to both camps.)
>> Do you think the majority of the people will be interested soon
>> enough? I don't see that as likely. What happens to those folks
>> who aren't buying it?
> I'm not sure what you mean about "soon enough". Much as I abhor
> speculation about the particulars of the coming Singularity (given
> that such a thing by definition makes prediction impossible), I happen
> to think the emergence of a PostHuman species is at least as likely to
> trigger such an event as the development of strong AI or
> nanotechnology (which are often cited as potential triggers).
I only meant soon enough not to get totally creamed economically by the
posthumans, if we don't have so much abundance that everyone has what
they need and more than a little of what they desire. Normal human
skill levels would not likely be very marketable. So what happens to
those people who plain don't want to make the move even if it is free?
If we have practical abundance I see no reason those choosing to remain
human need be in any immediate danger, except perhaps psychologically.
I am currently more in favor of, and contributing toward, IA and
other aspects of creating posthumans from humans. I believe that
starting from that beginning is more likely to end up well than
starting from scratch as in AI efforts.
> It could well be that within a few months of the emergence of the
> first group of true PostHumans, with their massively enhanced
> intellects and physical forms, a wave of bandwagon-jumping will sweep
> the globe as the pre-PostHumans see the advantages of transitioning,
> leaving behind a relative handful of True Believers as the rest of us
> go jetting off to the far reaches of the Milky Way in our personal
Romantic, but I suspect posthumans will emerge gradually and be quite
vulnerable in the beginning. I don't think the ideal outcome is that
the posthumans simply split, but it is preferable to destroying humanity.
> Or, the numbers of PostHumans could remain relatively small, and they
> could dedicate themselves to the service of their still-unenhanced
> Little Cousins, utilizing a portion of their intellects to that
> purpose as they explore the wider areas of truth and knowledge amongst
> themselves, completely unsuspected by the masses. Or, the first group
> of PostHumans could wipe out the remaining humans, or uplift them
> against their will. The truth is, we have no way of knowing what our
> responses will be when we have an IQ of 10,000, physical immortality,
> and the resources of a small nation.
Then those that care about humanity have quite ample reason to do what
they can to prevent posthumanity from coming into existence. I think
we need much better answers, and a real idea of and commitment to what
the intent is. I cannot say that becoming posthuman is worth it to
me personally if it likely means the destruction of all but a posthuman
handful of what was humanity. I am in this so that everyone on earth may
have undreamed-of abundance and opportunity, including the opportunity
to become posthuman. I am not interested in the advancement of a
handful to super-powered selfish ape status who then destroy everyone else.
> All the more reason to be in the vanguard, sez I. Cuts down on the
> risk of being in the ranks of the exterminated or abandoned.
I would rather be exterminated than exterminate humanity. It is a
matter of choice. We should not try to excuse the ugly choice as what
our mysterious someday super-brain might decide. We can decide now what
we are building and what type of being we wish to become. That is the
part that is known and in our control. Every moment we can continue to
make that choice.
> Although I would say that waiting until everyone is guaranteed a seat
> on the train is not an option, mostly because it will NEVER happen.
> Someone will always choose-- through ignorance, superstition (is there
> a difference?), or just plain cussedness-- to remain as they are.
> Those people must not be allowed to hold up our own evolution:
Sounds pretty arrogant. You say you only want the freedom to decide
for yourself, yet then you want the freedom to decide for everyone, or to
condemn them to death if they do not agree with you.
> "Your decision to remain the same does not compel me not to change."
But as stated, our decision to change may condemn all of humanity except
those who choose as we do, because we leave open the option of deciding
to destroy them. If we make no commitment to humanity then why should
humanity aid or even tolerate our existence? Who cares how fast and
deeply we can think, or how much raw power we wield, if we remain
stunted, dissociated chimps psychologically? The power of gods should
not be given to stunted chimps.
>>> It is a critical question, and also opens up the can of worms of the
>>> desirability/morality/practicality of imposing such a transition on
>>> those who don't, in their pre-PostHuman state, choose to undergo the
>>> transition (in a world where the majority of the population has an
>>> effective IQ of 1000, can someone with a "normal" IQ of 100 make
>>> such a choice and be said to be truly "informed"? Such standards are
>>> relative...) Another sticky wicket appears when those who choose not
>>> to transform themselves seek to impose that decision on everyone.
>> If it is wrong for them to keep us relatively dumb and short-lived
>> would it be right for us to force >human intelligence and
>> indefinitely long life span on them against their wishes?
> To my mind, the answer to your question would be no; to force such a
> transition against someone's will would be immoral. There are two
> caveats, however.
> If it became impossible to attain (or retain) a PostHuman status
> _without_ resorting to such a mass-uplift program (a-la Magneto in the
> first X-Men movie, who sought to turn the world's leaders into mutants
> so they would become sympathetic to the plight of mutants), then I
> would say it would be justified to do so. I will grant that it is
> certainly not a cut-and-dried case. In the situation as I describe it,
> it is literally the evolutionary struggle for survival between two
> species, and given a choice I think forced uplifting is preferable (on
> an individual level) to extermination, if a PostHuman species is faced
> with extermination itself.
We do not have to have an "evolutionary struggle" of the kind you may
have in mind unless we decide to. That is my point. What are we
willing to commit to? How much are we willing to grow in order to be
ready to command the powers of a god? We must learn to go beyond models
that no longer fit beings such as we are becoming. The non-posthumans
will not even be able to pose a threat a bit further down the road.
There is no real struggle for survival at that point. Until then, the
question is why humanity should give birth to us and allow us to grow
beyond our vulnerable stage. Clearly it should not do so without some
assurances as to our subsequent treatment of humanity.
> The second caveat is, I think, even more disturbing to us mere humans
> in its way. Much as we, today, concede that the mentally retarded are
> incapable of making sound judgements for themselves, it may well be
> the case that PostHumanity could see mere humanity in a similar
> position, relatively speaking. Insights and gestalts which are clear
> as crystal to someone with an IQ of 10,000 would doubtless be
> hopelessly confused muddles of ignorance and confusion in our poor
> primate brains. It may well be the case that there is an objective and
> perfectly sound reason for forcing such uplifts that is unknowable to
> someone in our state, the logic and inevitability of which can only be
> appreciated after one has been uplifted. Much as someone with an IQ of
> 50 today would be incapable of making an informed decision to
> _decline_ the opportunity to take a drug that would double their IQ.
That is indeed possible, but it is not the assurance that is needed for
our own sanity. Forced transcension seems almost a contradiction in
terms. It is likely a matter of much more than merely upgrading the
hardware. Do you really want to bring to full posthuman capability
someone violently opposed to it? It is far better to offer gentle
slopes and persuasion. With medical nanotech around there is no great
hurry for people to be ready and willing to become posthuman. They can
dawdle for centuries if they wish. They can even die if they wish.
> But of course those caveats are mere speculation. Until we are faced
> with the reality of the situation unfolding before our eyes, the
> original principle, that forced uplifting is immoral, should stand as
> the default position. Only if that position becomes untenable should
> it be reassessed.
>> Do we want to continue this step-wise conundrum indefinitely? Will
>> each step of further progress revisit the question?
> Now you're getting into the "what will the Singularity after the
> Singularity be like?" Obviously, we can't know. But for now, we can
> tentatively say we can apply the same principle, subject to revision
> once our several-orders-of-magnitude-smarter intellects deal with the
There are many steps between here and there, and many more I suspect
thereafter. That is part of the difference between Singularity and
Rapture. Just because we can't see past the Singularity is no reason
to suppose there is no further development on the other side.
>> Who is going to pay for all the upgrades if they aren't effectively
>> so abundant that there is no need to bother charging for them?
> Here's where the capitalist-vs.-socialist axis within contemporary >H
> comes in. The capitalists (I include here the libertarians, naturally)
> would answer your question by saying that everybody will pay for their
> own upgrades. The socialists would answer by saying the State will
> provide upgrades for everyone. It's a legitimate debate (despite what
> people on BOTH sides of the discussion say), and the reality will
> probably come somewhere in the middle, but from a personal standpoint
> I will tell you that I will be paying for as many upgrades as
> corporations will sell me and the state will allow me to purchase (and
> I can afford, naturally!).
Everyone will not pay for their own if the un-upgraded are no longer
employable. I suggest that part of the price of the birth of
posthumanity is a compact with humanity. Part of the compact may be
that, in exchange for our birth, we improve the processes and make
uplift available to all who desire it and have prepared, or are willing
to be prepared, for it. It seems a reasonable thing to ask. So I
disagree with both of the answers above.
>> Wouldn't we then be attempting to force an equality of capability?
>> Is that a good thing?
> I daresay if you asked that question of James Hughes and Max More, you
> would get pretty much diametrically opposed answers. Yet each is a
> Transhumanist in his own way, because that's a question of methods,
> not goals. Personally, I don't think it's practical to force equality
> of capability short of a social system that would make Aldous Huxley
> blush. I favor personal competition, tempered by the intervention of
> society as a whole when such competition spills over into realms where
> it proves detrimental to society as a whole (I am thinking of
> Icelandic feud as an example of a social convention used to regulate
> individual conflict, keeping it from degenerating into whole-scale
> civil war; there are many others).
I think this may be a model that we don't carry forward, but I could
>> Who says our set of desirability metrics are the most righteous and
>> privileged to the point of forcing others to adopt what we consider
> I don't think we should, with the caveats I give above. Let each have
> his choice, unless their choice means I will be eliminated myself, or
> unless my massively-augmented intellect realizes that those poor
> colloidal-brained primates aren't capable of making those decisions
> for themselves. But I'll defer those questions until they become
It goes both ways. There will be those, sooner or later, whose
abilities extend beyond anything you wish to pursue. The point of
"enough" comes even to a posthuman, unless we set out to create a world
where to rest is to be in mortal danger. Again, it is our choice. I
will not choose an infinite hamster wheel of continuous upgrade-or-else
as the goal of my becoming posthuman. I would sooner run away to a
nunnery.
>>> I say give as many people as possible the opportunity to shed their
>>> pre-PostHuman existence, and I am fairly sure that Eugen (and indeed
>>> the vast majority here) would agree on that broad goal.
>> I agree on the goal but it ducks a lot of very thorny questions to
>> just say that much.
> Not at all; it just provides a point of reference from which we _can_
> start talking about the thorny issues, especially since there seem to
> be well-defined camps with different opinions on the best way to
> approach some of them. Which is what we're doing right here (and I
> might argue which is one of the main purposes of these sorts of
> discussion lists, especially this one and WTA-talk).
Yes indeed. Thank you for the conversation.