[ExI] [tt] Identity thread again

Anders Sandberg anders at aleph.se
Wed Mar 25 08:33:33 UTC 2015

Tara Maya <tara at taramayastales.com> , 24/3/2015 5:04 PM:

Therefore it is likely, though very sad and unnecessary, that the existence of copy-clans would lead to war. You can’t introduce an essentially new species of sentient beings into an ecology that so far has only supported one without there being a contest for resources. If we had the ability to populate new areas, in space, or perhaps new nations in sea-steads or something, maybe that contest could be postponed by expanding the ecological niche.

I don't think this is enough of an argument: there have been peaceful resource-constrained societies (think Japan during the Edo period), and introducing a new species with *different* resource demands does not necessarily lead to competition.

Still, in my and Peter Eckersley's paper "Is brain emulation dangerous?" ( http://www.degruyter.com/view/j/jagi.2013.4.issue-3/jagi-2013-0011/jagi-2013-0011.xml ) we listed a bunch of reasons to be concerned:

"The consensus view among those who have studied the history of war is that there is no single agreed major cause for violent conflict. (Levy and Thompson 2009; White 2012) Unfortunately, brain emulation technologies are capable of providing many of the kinds of ingredients that are commonly regarded as contributing to the risk of war (Humphreys 2002; van Evera 2013), including:

* increasing inequality (between emulations, humans who can afford and want to "become" emulations, and humans who cannot);
* groups that become marginalized (humans who cannot compete with emulations, or emulations and groups of emulations that are at a disadvantage compared to other emulations);
* disruption of existing social power relationships and the creation of opportunities to establish new kinds of power;
* potential first-strike advantages and cumulative resource advantages (holding more resources increases resource-gathering efficiency);
* the appearance of groups of intelligent beings who may empathise with each other even less than humans historically have done;
* the appearance of groups of beings with strong internal loyalty and greater willingness to "die" for what they value (Shulman 2010);
* particularly strong triggers for racist and xenophobic prejudices;
* particularly strong triggers for vigorous religious objections;
* the creation of situations in which the scope of human rights and property rights are poorly defined and subject to dispute (and surprise)."

These apply to brain emulations in general, with copy-clans being a driver for just a few. None of these are perfect arguments: they just modify the likelihood of conflict, which is also affected by many other things. There are also mitigating factors. In the paper we argue that better computer security makes some of these factors less risky (it is easier to protect oneself, first-mover advantages go down, rights can be enforced better), and of course a biological-plus-software humanity is actually more robust against existential threats than a single-kingdom humanity.

Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University