[ExI] Unfriendly AI is a mistaken idea.

Russell Wallace russell.wallace at gmail.com
Tue May 29 03:52:15 UTC 2007


On 5/29/07, Samantha Atkins <sjatkins at mac.com> wrote:
>
>
> This line of reasoning has considerable dark side potential.  We can and
> do go beyond our EP in at least some ways.  If we truly could not do so then
> we would always be untrustworthy cosmic rednecks no matter how augmented we
> someday become.    We would also find it impossible to overcome our EP even
> if it was a matter of our very survival which I think in some ways it is.
>

Mmm, but is "overcome" the right way to come at it? I'm inclined to look at
it as a matter of emphasizing some aspects of EP over others. I mean, we
have the instinct "hate and kill other people [to get rid of competition and
threats or potential threats]", sure, and that's one we understand needs to
be kept on a tight rein. But we also, fortunately, have the instincts "deal
fairly with other people [so they will reciprocate]" and "love and protect
your own [for they share your genes]". Why is the survival and welfare of
humanity my supergoal? It's not a theorem of ZFC. It's because I'm relying
on that third instinct, just generalizing the "your own" part a bit to
include my species rather than only my tribe.
