[ExI] [tt] Identity thread again
William Flynn Wallace
foozler83 at gmail.com
Wed Mar 25 16:45:00 UTC 2015
'Probably there wouldn't be one "best" me but rather hundreds of various
specialized versions, probably buying and selling cognitive modules from
other clans.'
Yes. According to the best available social psychologist (me), there is a
'you' for every other person in the world; your actions would certainly
correlate across those interactions, though never perfectly.
According to this interpretation, you are the elephant and everyone else is
a blind person touching you. Hence every person brings out a different
subset of your available actions and does so every time that person
interacts with you. Plus, your subsets change over time, some weakening,
some strengthening, some vanishing, some being added. And of course there
are these variables and more: what you ate, your recent spat with your
mate, the morning news, the weather, your missing dog, and thousands more.
And this does not even count the people in your head, with whom you also
interact - that is, the implied presence of others. Put a photo of a
face in a coffee room and donations go up; just the eyes would do it.
Every day is a new day and a new you - somewhat.
If you were uploaded, and the program allowed you to change as described
above, the physical you and the uploaded you might diverge considerably
over time.
bill w
On Wed, Mar 25, 2015 at 3:33 AM, Anders Sandberg <anders at aleph.se> wrote:
> Tara Maya <tara at taramayastales.com>, 24/3/2015 5:04 PM:
>
>
> Therefore it is likely, though very sad and unnecessary, that the
> existence of copy-clans would lead to war. You can’t introduce an
> essentially new species of sentient beings into an ecology that so far has
> only supported one without there being a contest for resources. If we had
> the ability to populate new areas, in space, or perhaps new nations in
> sea-steads or something, maybe that contest could be postponed by expanding
> the ecological niche.
>
>
> I don't think this is enough of an argument: there have been peaceful
> resource-constrained societies (think Japan during the Edo period), and
> introducing a new species with *different* resource demands does not
> necessarily lead to competition.
>
> Still, in my and Peter Eckersley's paper "Is brain emulation dangerous?" (
> http://www.degruyter.com/view/j/jagi.2013.4.issue-3/jagi-2013-0011/jagi-2013-0011.xml
> ) we listed a bunch of reasons to be concerned:
>
> "The consensus view among those who have studied the history of war is
> that there is no single agreed major cause for violent conflict. (Levy and
> Thompson 2009; White 2012) Unfortunately, brain emulation technologies are
> capable of providing many of the kinds of ingredients that are commonly
> regarded as contributing to the risk of war (Humphreys 2002; van Evera
> 2013), including:
>
> * increasing inequality (between emulations, humans who can afford and
> want to "become" emulations, and humans who cannot);
> * groups that become marginalized (humans who cannot compete with
> emulations, emulations or groups of emulations that are at a disadvantage
> compared to other emulations);
> * disruption of existing social power relationships and the creation of
> opportunities to establish new kinds of power;
> * potential first-strike advantages and cumulative resource advantages
> (holding more resources increases resource-gathering efficiency);
> * the appearance of groups of intelligent beings who may empathise with
> each other even less than humans historically have done;
> * the appearance of groups of beings with strong internal loyalty and
> greater willingness to "die" for what they value (Shulman 2010);
> * particularly strong triggers for racist and xenophobic prejudices;
> * particularly strong triggers for vigorous religious objections;
> * the creation of situations in which the scope of human rights and
> property rights are poorly defined and subject to dispute (and surprise)."
>
> These apply to brain emulations in general, with copy-clans being a driver
> for just a few. None of these are perfect arguments: they just modify the
> likelihood of conflict, which is also affected by a lot of other things.
> There are also mitigating factors. In the paper we argue that better
> computer security makes some of these factors less risky (it is easier to
> protect oneself, first mover advantages go down, rights can be enforced
> better), and of course a biological-plus-software humanity is actually more
> robust against existential threats than a single-kingdom humanity.
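>
> To make the "cumulative resource advantages" bullet above concrete, here
> is a minimal sketch (Python; the growth rates and horizon are illustrative
> assumptions of mine, not numbers from the paper). If gathering efficiency
> merely scales in proportion to holdings, a head start stays a fixed lead;
> if efficiency itself grows with holdings, the head start compounds into a
> runaway gap:
>
> # Toy model of resource accumulation. All parameters are made-up
> # illustration values, not estimates from the paper.
>
> def proportional(initial, rate=0.05, ticks=100):
>     """Each tick, gains are a fixed fraction of current holdings."""
>     r = initial
>     for _ in range(ticks):
>         r += rate * r
>     return r
>
> def superlinear(initial, k=0.01, ticks=100):
>     """Each tick, the growth rate itself rises with holdings."""
>     r = initial
>     for _ in range(ticks):
>         r += k * r * r
>     return r
>
> # A 10% head start stays a 10% lead under proportional growth...
> print(proportional(1.10) / proportional(1.00))  # ~1.10
>
> # ...but compounds explosively once growth is superlinear.
> print(superlinear(1.10) / superlinear(1.00))  # astronomically large
>
> This is only a cartoon, but it is the dynamic behind listing cumulative
> resource advantages as a destabilizer: under the wrong growth regime,
> whoever pulls ahead is never caught.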
>
>
>
> Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of
> Oxford University
>