[ExI] future of warfare again, was: RE: Forking

Anders Sandberg anders at aleph.se
Sat Dec 31 09:47:39 UTC 2011


On 2011-12-30 22:44, spike wrote:
> Keith how does it work in evolutionary psychology as warfare evolves to the
> point where it involves fewer and fewer people, and far fewer actual
> injuries or death?  We are seeing what amounts to warfare between Iran and
> the US/Britain/Israel axis.  Drones are flying over and being apparently
> commandeered and captured, viruses are wrecking their nuclear separation
> facilities, plenty of computerized struggle is reportedly ongoing, but most
> of the population is unaware and unconcerned.  So different is this from
> every previous war in every previous century of human existence.

This is not necessarily a good thing from an evolutionary psychology 
perspective. Military psychology has long struggled to train away 
normal (likely evolved) inhibitions against hurting other people. But 
waging war remotely and through automation likely removes many of those 
inhibitions outright: there is no personal connection to the target, 
and there is ample room for framing effects, biases in moral cognition 
(compare the switch vs. the footbridge cases of the trolley problem), 
diffusion of responsibility, and Milgram-experiment-like phenomena. 
Like nonlethal weapons, it might lower the threshold for engaging in 
deadly violence.


The actual deadliness of wars shows no strong trend in the data I 
have seen (e.g. 
http://www.aleph.se/andart/archives/2008/07/the_graphs_of_war.html ), but 
there might be a difference in recent high-tech wars; I suspect it is 
not toward greater deadliness. In fact, even bloody and genocidal 
conflicts are surprisingly "safe" on an individual basis - the risk of 
death per year is typically around one percent.
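(As a rough back-of-the-envelope illustration of what that means: a one 
percent annual risk sustained over a decade of conflict compounds to 
1 - 0.99^10, i.e. roughly a 10% cumulative risk for an individual.)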


The real threats from high-tech wars are that 1) they can be global in 
scope, 2) they can unfold very quickly, and 3) they can potentially 
become very lethal.

1 means that dispersion is not easy, that everybody is everybody else's 
neighbor, and that the conflict size distribution loses the cut-off that 
local wars used to impose. 2 means that escalation and accidents can 
happen fast, possibly circumventing safety and de-escalation processes. 
And 3 means that if there is a shift from the normal focus on limiting 
the opponent's ability to wage war to killing a lot of people for some 
reason (side-effects, tactics, strategy), such wars could become 
extremely bad.

(On the hopeful side, I recently got a personal communication from 
researchers investigating nuclear winter who thought that the 
probability of total human extinction due to a full-scale northern 
hemisphere nuclear exchange and the ensuing climate disaster was 
between one in 100,000 and one in 10,000. That is cheering!)


-- 
Anders Sandberg
Future of Humanity Institute
Oxford University
