[ExI] Wars, was identity thread

Keith Henson hkeithhenson at gmail.com
Wed Mar 25 23:09:58 UTC 2015

On Wed, Mar 25, 2015 at 5:00 AM,   Anders Sandberg <anders at aleph.se> wrote:


> "The consensus view among those who have studied the history of war is that there is no single agreed major cause for violent conflict. (Levy and Thompson 2009; White 2012)

I know I am not alone in tagging resource competition as the major
cause for violent conflict.  Azar Gat and Steven A. LeBlanc hold the
same view.  The US Civil War forced me to include anticipation of bad
times a-coming as being on the same causal path to war.

The usual path is (1) anticipation of a bleak future, (2) the spread
of xenophobic memes through the social group that act to dehumanize
the competing group and sync up the warriors for attack, and (3)
physical attack.

This has been understood for 900 years.  See the Wikipedia article on
Pope Urban II.  Why this isn't widely understood is beyond me.

Religions are xenophobic memes.  People often make the error of
mistaking a proximate cause for the ultimate one.  It should be
immediately obvious: I can think of no instance of a population with a
high ratio of resources to numbers and good future prospects that
started a war.  It is also clear that a population moving into
previously uninhabited territory doesn't have a problem with wars
until population growth starts pressing against the resource limits.

> Unfortunately, brain emulation technologies are capable of providing many of the kinds of ingredients that are commonly regarded as contributing to the risk of war (Humphreys 2002; van Evera 2013), including:
> * increasing inequality (between emulations, humans who can afford and want to "become" emulations, and humans who cannot);
> * groups that become marginalized (humans who cannot compete with emulations, emulations or groups of emulations that are at a disadvantage compared to other
> emulations);

These strike me as rather unlikely.  If we have the technology to
upload anyone, the cost will fall like a stone to the point where it
is cheaper to live uploaded than outside the simulation.

> * disruption of existing social power relationships and the creation of opportunities to establish new kinds of power;
> * potential first strike-advantages and cumulative resource advantages (holding more resources increases the resource-gathering efficiency);

Perhaps, but it seems unlikely to me.  There is also the effect of
running the simulation faster than base-level reality.  That makes
nearby material valuable while material far away becomes close to
worthless.  It could also decouple base reality and the simulations
economically.  (What does either have that the other might want?)

> * the appearance of groups of intelligent beings who may empathise with each other even less than humans historically have done;
> * the appearance of groups of beings with strong internal loyalty and greater willingness to "die" for what they value (Shulman 2010);

Not to mention that the warriors can come back from a suicide mission,
perhaps with memories right up to the point they "died."  Of course,
given backups, it's not going to be obvious that they killed anyone.

> * particularly strong triggers for racist and xenophobic prejudices;
> * particularly strong triggers for vigorous religious objections;

One of the uploaded simulations is sure to be called Heaven.  I can
see it now: preaching against moving to Heaven.

> * the creation of situations in which the scope of human rights and property rights are poorly defined and subject to dispute (and surprise)."

Some of that has already been seen in Second Life.

> These apply to brain emulations in general, with copyclans being a driver for just a few. None of these are perfect arguments: they just modify the likelihood of conflict, which is also affected by a lot of other things. There are also mitigating factors. In the paper we argue that better computer security makes some of these factors less risky (it is easier to protect oneself, first mover advantages go down, rights can be enforced better), and of course a biological plus software humanity is actually more robust against existential threats than a single kingdom humanity.

I am not so sure about the last.  If the two got into a resource
conflict, chances are only one would survive.  On the other hand, if
the uploaded coveted the deep ocean for cooling, perhaps there would
be little conflict over resources.  I wonder how large a fraction of
the population would upload if it were an option?  For marketing
reasons, it would have to be reversible so you could upload for the
weekend to try it out.

I sort of suspect that copies of people will be forbidden by consensus
in most cases, perhaps with an exception for those who leave the local
scene never to return.

When you think about it, would you *want* a lot of your copies around?

