[extropy-chat] Human - Posthuman gap

Samantha Atkins sjatkins at mac.com
Wed May 4 05:59:33 UTC 2005

On Apr 29, 2005, at 8:16 PM, Joseph Bloch wrote:

> Samantha Atkins wrote:
>> I only meant soon enough not to get totally creamed economically  
>> by the posthumans if we don't have so much abundance that everyone  
>> has what they need and more than a little of what they desire.    
>> Normal human skill levels would not likely be very marketable.
>> So what happens to those people who plain don't want to make the
>> move even if it is free?
>> If we have practical abundance I see no reason those choosing to  
>> remain human need be in any immediate danger except perhaps  
>> psychologically.
>> I am currently more in favor of and into contributing toward IA  
>> and other aspects of creating posthumans from humans.  I believe
>> that starting with that beginning is more likely to end up well  
>> than starting from scratch as in AI efforts.
> I suppose it depends on the nature of the PostHuman condition. Will  
> it necessarily include "practical abundance"? I don't think that's  
> a requirement, myself, and thus it might well be the case that  
> contemporary economics still function (at least in the near-term  
> after the advent of PostHumanity; I think the scales
> will eventually tip in favor of "practical abundance").

I think it is a requirement if the result is going to be livable or  
even achievable without massive suffering and death of our fellow  
human beings.   It was the real possibility of practical abundance  
for all that initially drew me to these things.

> But bear in mind, there are other forms of competition than  
> economics. In the social sphere, PostHumans will have as many if  
> not more advantages over normal humans than they do in the economic  
> sphere. To make a necessarily poor analogy, two PostHumans could  
> interact on a social level as we do today with email and full  
> access to the Internet. A normal human would be writing letters and  
> mailing them, trundling down to the library for any references that  
> might be needed. Imagine a normal human trying to have a  
> relationship with a PostHuman, who is used to being able to share  
> mental experiences as easily as we share files online. "Mere  
> humans" are going to be at a disadvantage in every sphere; not only  
> economic, but social, political, athletic, academic, etc.

Of course, but that is mostly not what we were addressing.  Humans
would not likely mix much with posthumans.

>>> The truth is, we have no way of knowing what our responses will  
>>> be when we have an IQ of 10,000, physical immortality, and the  
>>> resources of a small nation.
>> Then those that care about humanity have quite ample reason to do  
>> what they can to prevent posthumanity from coming into existence.   
>> I think we need a lot better answers and a real idea of and  
>> commitment to what the intent is.     I cannot say that becoming  
>> posthuman is worth it to me personally if it likely means the  
>> destruction of all but a posthuman handful of what was humanity.    
>> I am in this that everyone on earth may have undreamed of  
>> abundance and opportunity including the opportunity to become  
>> posthuman.   I am not interested in the advancement of a handful to
>> super powered selfish ape status who then destroy everyone else.
> The point is, you and I are literally incapable of imagining what  
> our PostHuman selves would think is appropriate.

Then we give humanity no guarantees or even stated intentions?  We  
tell them that just because we think it will be cool that we will  
make them obsolete and perhaps will do nothing at all toward their  
well-being and may actually - we can't know what we may decide later  
- destroy them all outright?   Please tell me exactly why humanity  
would want to tolerate this.  I am not getting it.  If I am not
getting it then you can be darn sure that non-transhumanists aren't
either.

> Speculation, in that case, is useless. We can gush all the  
> platitudes about the dignity of humanity, and respect for those who  
> choose the other path, but once we have transitioned ourselves, all  
> bets are off. Much as the promises you or I might make as a four- 
> year-old cannot seriously be counted on when we're forty.

I am not talking about speculation.  I am talking about commitment.   
There is a whole world of difference.   We talk about personal  
continuity a lot here.  That includes the ability to commit  
contractually.  Should we write into all contracts that the agreement  
is null and void if our intelligence increases more than a certain
amount?

>> I would rather be exterminated than exterminate humanity.  It is a  
>> matter of choice.
> Indeed. While I respect your choice to be exterminated in such a  
> situation, I trust you will respect my choice to resist such a fate.
> Hopefully, of course, it won't come down to such a decision. It  
> certainly doesn't _have_ to; there are many possible scenarios.
> But if it does come down to a question of them or us, quite frankly, I
> choose us.

With the power that posthumans would have there is no way it would be  
such a binary choice.  That is part of my point.

>> We should not try to excuse the ugly choice as what our mysterious  
>> someday super brain might decide. We can decide now what we are  
>> building and what type of being we wish to become.  That is the  
>> part that is known and in our control.  Every moment we can  
>> continue to decide.
> But our future-selves are not bound by those decisions, any more  
> than we are bound by the choices we made in kindergarten.

We are bound if we commit to being so bound.  We are individually  
capable of deciding and living that decision.   A large part of  
becoming posthuman is developing the ability to decide over many more  
aspects of existence than was possible before.  Yes we may decide  
differently at some future time.  I am less concerned with that than  
with what we decide now and are willing to live to.  It is our  
decisions and professed goals now on which we will be judged and  
which will largely determine our near-term fate.

>>> Although I would say that waiting until everyone is guaranteed a  
>>> seat on the train is not an option, mostly because it will NEVER  
>>> happen. Someone will always choose-- through ignorance,  
>>> superstition (is there a difference?), or just plain cussedness--  
>>> to remain as they are. Those people must not be allowed to hold  
>>> up our own evolution.
>> Sounds pretty arrogant.  You say you only want the freedom to  
>> decide for you.  Then you want the freedom to decide for everyone
>> or to condemn them to death if they do not agree with you.
> The freedom to improve oneself is the ultimate freedom. Is freedom
> not worth fighting for? And please always bear in mind, this is an  
> outcome I neither desire nor particularly expect. But to condemn you
> and me to death, illness, and relative retardation when it is not
> necessarily inevitable is something that deserves to be resisted.  
> Would you not agree that a group that wanted to kill everyone once  
> they reached the age of 15, and who actively prevented any sort of  
> education, and who held back any medicines, would be a group that  
> should be resisted, and violently if necessary? I see no practical  
> difference between my hypothetical example and those who want to  
> nip Transhumanism in the bud in the name of its being "unnatural".

This freedom does not free us from making ethical decisions, though.
It actually requires greater ethics to be trusted with superhuman
powers.  I am attempting to point this out.

> Now, I'm not calling for the Transhumanist Revolution...  
> fortunately it hasn't come to that, and in all likelihood won't. I  
> don't think it's ultimately possible to contain the social and  
> technological trends that are already extant.

Perhaps not "ultimately," but it is quite possible to stop much of it for
some time, possibly more time than most of us have.  The level of  
societal control, interference, surveillance and ability to impose  
against the choices of minorities is increasing.

>> "Your decision to remain the same does not compel me not to change."
>> But as stated our decision to change may condemn all of humanity  
>> except those who chose as we do because we leave open the option  
>> of deciding to destroy them.   If we make no commitment to  
>> humanity then why should humanity aid or even tolerate our  
>> existence?   Who cares how fast and deeply we can think or how  
>> much raw power we wield if we remain a stunted disassociated  
>> chimps psychologically?  The power of gods should not be given to  
>> stunted chimps.
> Self-selected psychology is, of course, one of the elements that is  
> often bandied about as a PostHuman trait.
> But are you arguing for the inclusion of some sort of "we love  
> humanity" meme on the basis of its inherent value, or merely as  
> something that is necessary at the onset of PostHumanity, as a sort  
> of tactical maneuver?

I believe that it is a very basic ethical decision that is a litmus  
test as to whether we deserve to transcend normal human limits.   It  
is also a tactical matter but that is not the reason I suggest it.

>> We do not have to have an "evolutionary struggle" of the kind you  
>> may have in mind unless we decide to.  That is my point.  What are  
>> we willing to commit to?  How much are we willing to grow to be  
>> ready to command the powers of a god?  We must learn to go beyond  
>> models that no longer fit beings such as we are becoming.
>> The non-posthumans cannot even pose a threat a bit further down
>> the road.  There is no real struggle for survival at that point.   
>> Until then the question is why humanity should give birth to us  
>> and allow us to grow beyond our vulnerable stage.    Clearly it  
>> should not do so without some assurances as to our subsequent  
>> treatment of humanity.
> I was reluctant to indulge my flights of fancy, and this is exactly  
> why. I don't "have in mind" the sort of conflict I described. I was  
> merely putting it out as one of many possibilities.

It is a very real possibility if we refuse ethical commitment.

>> That is indeed possible but it is not the assurance that is needed  
>> for our own sanity.  To force transcension seems almost a  
>> contradiction in terms.   It is likely a matter of much more than  
>> merely upgrading the hardware.   Do you really want to bring to  
>> full posthuman capability someone violently opposed?  It is far  
>> better to offer gentle slopes and persuasion.  With medical  
>> nanotech around there is no great hurry for people to be ready and  
>> willing to become posthuman.  They can dawdle for centuries if  
>> they wish.  They can even die if they wish.
> You and I might agree with that point of view today. But our  
> PostHuman selves might look back on this email and smile  
> condescendingly at our naivete. Remember, I'm just idly speculating
> here; my point is we can't KNOW what we'll think, and anything we  
> say today could be completely reversed after we're Gods.

That really isn't the question today, though.  We only have the power
of choice now.

>>> Now you're getting into the "what will the Singularity after the  
>>> Singularity be like?" Obviously, we can't know. But for now, we  
>>> can tentatively say we can apply the same principle, subject to  
>>> revision once our several-orders-of-magnitude-smarter intellects  
>>> deal with the question.
>> There are many steps between here and there and many more I  
>> suspect thereafter.  That is part of the difference between
>> Singularity and Rapture.   Just because we can't see past the  
>> Singularity is no reason to suppose there is no further  
>> development on the other side.
> Of course not! There will absolutely be development post- 
> Singularity (more than we can imagine, most likely). But we, by  
> definition, have no idea what form it'll take. So unless you're
> writing for Analog, such speculation is useless. :-)

It is also unnecessary for making an ethical choice today.

>> Not everyone will be able to pay for their own if the un-upgraded are no
>> longer employable.  I suggest that part of the price of the birth  
>> of posthumanity is a compact with humanity.  Part of the compact  
>> may be that in exchange for our birth we improve the processes and  
>> make uplift available to all who desire it and have prepared or  
>> are willing to be prepared for it.  It seems a reasonable thing to
>> ask.   So I disagree with both of the answers above.
> What if the "practical abundance" you mentioned above becomes a  
> reality? Then "employable" ceases to be a meaningful category.

Of course.  I suspect it becomes reality because again we decide to  
make it so.

> And I'm all in favor of allowing as many people to transition to  
> PostHumanity as want to. But you seem to be saying that nobody  
> should be able to until everyone is able to. I happen to think that  
> waiting until everyone can partake would be like waiting to build  
> the first car until everyone can have one, or the first PC until we  
> can give one to everyone on the planet. There are going to be  
> "first adopters" of any technology, and I am doing everything I can  
> to not only make sure those technologies become available, but that  
> I'm first in line.

I am not saying that at all.  There indeed are always forerunners and  
early adopters of any technology.   In this case the successful early  
adopters will be in a position to massively improve the technology  
and whatever else they set their minds to that it is possible to
improve.

> I refuse to forgo my own ascension on the merest _possibility_
> that the distribution of such technology is inequitable, waiting  
> until I am assured that everyone gets their immortal intelligence- 
> enhanced body. Is it right that I am denied my PostHuman state  
> because _everyone_ can't do it too? I think not.

You are talking about a position that I am not remotely suggesting.

>>> I daresay if you asked that question of James Hughes and Max  
>>> More, you would get pretty much diametrically opposed answers.  
>>> Yet each is a Transhumanist in his own way, because that's a  
>>> question of methods, not goals. Personally, I don't think it's  
>>> practical to force equality of capability short of a social  
>>> system that would make Aldous Huxley blush. I favor personal  
>>> competition, tempered by the intervention of society as a whole  
>>> when such competition spills over into realms where it proves  
>>> detrimental to society as a whole (I am thinking of Icelandic  
>>> feud as an example of a social convention used to regulate  
>>> individual conflict, keeping it from degenerating into whole- 
>>> scale civil war; there are many others).
>> I think this may be a modeling that we don't carry forward but I  
>> could be wrong.
> After the Singularity? All bets are off. But right now, we need to  
> figure out what sort of pre-Singularity socio-political structure  
> will allow the maximum number of people to ascend to PostHumanity  
> once the time comes. I happen to think a (small-r) republican- 
> capitalist system (as we have here in the US) is optimal, while  
> others think anarcho-capitalism or democratic socialism are the  
> answer. Well, such is the debate in the marketplace of ideas...

While I lean strongly in the staunch libertarian direction, I think
that some libertarian views, and all of the political pigeonholes
we use today, are hopelessly quaint and confining even on this side
of the Singularity.

>> It goes both ways.  There will be those sooner or later whose  
>> abilities extend beyond anything you wish to pursue.  The point of
>> "enough" comes to even a posthuman.   Unless we set out to create  
>> a world where to rest is to be in mortal danger.  Again, it is our  
>> choice.  I will not choose an infinite hamster wheel of continuous
>> upgrades or else as the goal of my becoming posthuman.   I would  
>> sooner run away to a nunnery.
> Abilities I don't want to pursue? What are these words of which you  
> speak? They are foreign to me... ;-)
> As Benjamin Franklin said, "When you're finished changing, you're  
> finished."

Changing does not require an endless chase after ever more.  Nor
does growth.  Happiness especially doesn't require this.

- samantha
