[ExI] Humanity+ as self-guided evolution

samantha sjatkins at mac.com
Tue Jan 13 06:45:28 UTC 2009


Stefan Pernar wrote:
> Look at it this way: natural selection determines what can exist. Once 
> you make this your objective, in the sense of ensuring continued 
> existence, and include others by modifying the objective to ensure 
> continued co-existence, so it does not lead to contradictions with 
> Kant's categorical imperative, you get the basis for a truly beautiful 
> concept.

Unfortunately, I don't think well enough of Kant for this to add to your 
position. I will put to you the questions I often put to myself.

> 
> The reason being that in having the objective of ensuring not only 
> the existence of the self but at the same time coexistence with the 
> other, it becomes inseparable from compassion. 

Which others?  All others?  Sounds nice, but why, exactly?  Why, not as 
some "Categorical Imperative," but as something demonstrably in the 
rational self-interest of intelligent beings?  Nothing less will win 
the point.

> For in doing so you 
> effectively equate the self with the other. 

While I have had similar thoughts, ideas, and intuitions, that is not 
enough.  You need to show the clear, irresistible good of this for all 
intelligent beings involved, even those at quite different levels of 
intelligence.

> This will oblige you to love 
> the other just like you love yourself. 

What for, precisely?  What if that other is not remotely my equal in, 
say, intelligence, as would be the case between an unaugmented human 
and an advanced AGI?  Why would the AGI equate the human with itself 
and care as much for this empirically inferior intelligent being as it 
does for itself?  What would it gain by this lovely philosophy?  What 
if the human cannot substantially add to its being or help it achieve 
its values at all?  In that case, why should it care for the human as 
it does for itself?


> You will need to be fair because 
> you want to be fair to yourself. 

What is fair?  Is it fair to myself to forgo my own growth and 
development in order to enhance the long evolution of, say, the 
cockroach?  Is it fair for our advancement toward all the great values 
we perceive to be at the mercy of those who do not see them, or at the 
mercy of those who are ruled by delusion, irrationality, and profound 
ignorance?  Is it fair to forgo my own work and values to devote all, 
or even a large percentage, of my time and resources to those of my 
own kind who have less than I do, or who are unfortunate or simply 
less ambitious, or whose circumstances otherwise make them "needy" 
relative to me?  What is the relative weight of your own values 
against the values of others, values that may not be much like your 
own, held by those who may not even comprehend your own?


> Altruism becomes egoism, with the two 
> concepts becoming meaningless when following this principle, for 
> giving yourself up for others becomes the same as giving yourself up 
> for yourself, and vice versa.

There is no easily discerned meaning in "giving yourself up for 
yourself."  There is only value, and whether you move toward it or 
away from it.  If you give up greater values for lesser ones, that is 
clearly a net loss.

> 
> The reason why compassion evolved as the central theme of all world 
> religions during the time of the Axial Age (see Karen Armstrong's 
> book The Great Transformation) is because of its evolutionarily 
> advantageous properties. 

Compassion is one of many things that came about because it was 
evolutionarily advantageous.  Understanding, empathy, and compassion 
sufficient for cooperation among intelligent beings of roughly equal 
range are obviously very helpful and good things for all concerned.  
But compassion should not be hastily reified into the be-all and 
end-all good you seem to be pushing it as.  It is not at all clear, 
and indeed it did not evolve such, that we have this compassion for 
beings radically less intelligent than ourselves (or even for some 
much closer).  Thus it is not clear that this generalizes to a 
universe of radically disparate intelligences.


> Why then shouldn't we make use of this and follow 
> evolutionary concepts in guiding our self-modification? What other 
> rational alternatives are there?
>

Rationally, we have to avoid selection bias and the cherry-picking of 
whatever we find most intuitively appealing.


- samantha


