[ExI] Unfriendly AI is a mistaken idea.

Lee Corbin lcorbin at rawbw.com
Wed May 30 02:22:33 UTC 2007


Stathis writes

> On 29/05/07, John K Clark <jonkc at att.net> wrote:
> 
> > That's because the dictator's interests and his genes' interests are not the
> > same; however a dictator would do everything he can to increase his power
> > because if he doesn't take advantage of every opportunity some dictator 
> > wannabe will.
> 
> Couldn't you say the same about his expansionary urges as about his
> reproductive urges? Men who tried to have as many children as possible
> would over the years have come to dominate the gene pool,

I think that men did try to have as many children as possible, for a while,
until women wised up. (I mean that quite seriously; it's a theory of
evolutionary history that an "arms race" developed between women, who can
bear only relatively few children, and men, who can sire many.)

> but that hasn't happened.

It's still "trying to happen".  A number of men have genes (and cultural
influences) that cause them to have many more children than they can
contribute to raising and supporting.  In fact, the tendency is becoming
more widespread in the current era, because governments now will support
children when the parents cannot.  Hence it currently "pays" men in
evolutionary terms to have as many children as possible, and the genetic
part of this tendency is naturally very fit and is spreading.

There was a time when men who had more children than they could support
were sanctioned by society to such a degree that their genetic tendencies
to do so actually became less frequent in the population.  Or, cultures
and societies that inflicted such sanctions expanded and prospered at the
expense of societies that did not.

> How do you decide which part of an entity to break off and say that
> its interests are not those of the greater entity? 

I'm not sure of your meaning here.  If you are talking about a "dictator's
expansionary urges", then the argument John gave should work: a dictator's
survival will be favored, and his country will flourish, unless he's
fearful that further aggrandizement will be punished somehow.

In any case, are you suggesting that an AI who, say, took over part of
the solar system would envision some "greater good" that would prevent
it from taking over all of it?

Lee



