[ExI] Isn't Bostrom seriously bordering on the reactionary?
hkeithhenson at gmail.com
Fri Jun 17 15:15:42 UTC 2011
On Fri, Jun 17, 2011 at 5:27 AM, Anders Sandberg <anders at aleph.se> wrote:
snip (mostly I agree with Anders)
> Human (and by definition brain emulation) motivation is messy and
> unreliable, but also a fairly known factor.
Hmm. I came late to the game, but I don't see widespread
understanding of the situational and time variance of human
motivation.
My curiosity about seemingly irrational human behavior, cults, and
the bonding of kidnapped people to their captors led me to the
concept of capture-bonding (which John Tooby figured out 15 years
before I did) and to the "rational for genes, irrational for people"
psychological mechanisms that cause wars.
> Software intelligence based on
> brain emulations also come with human-like motivations and interests as
> default, which means that human considerations will be carried into the
> future by an emulation-derived civilization.
I think a certain amount of caution would be a good idea.
Imagine a society that was uploaded into a somewhat more capable
substrate than brain tissue. The human-based entities immediately
recognize a looming resource crisis. Xenophobic memes rapidly
circulate among the communicating brain emulations, and after a short
delay (years? days? milliseconds?) they attack.
snip (again I mostly agree with Anders)