[extropy-chat] SI morality

Paul Bridger paul.bridger at paradise.net.nz
Fri Apr 16 13:16:01 UTC 2004

Dan Clemmensen wrote:

> Being of a sunny and carefree disposition, and having a "belief" that 
> reason tends to
> "good," I think that the SI will rapidly create a morality for itself 
> that I will consider
> "good." Therefore, I'm in favor of actively accelerating the advent of 
> the SI if possible.

Given all the negative AI scenarios played out in popular culture 
(Matrix, Terminator etc.) I expect the most deadly obstacle to a 
big-bang type Singularity to be fear. All scientific obstacles can be 
conquered by the application of our rational minds, but something that 
cannot be conquered by rationality is...irrationality. However, I also 
expect AI to appear in our lives slowly at first and then with 
increasing prevalence.

Like you, I strongly believe a purely rational artificial intelligence 
would be a benevolent one, but I wouldn't expect most people to agree 
(simply because most people don't explore issues beyond what they see at 
the movie theater). There's a fantastic quote on a related issue from 
Greg Egan's Diaspora: "Conquering the universe is what bacteria with 
spaceships would do." In other words, any culture sufficiently 
technologically advanced to travel interstellar distances would also 
likely be sufficiently rationally advanced to not want to annihilate us. 
I think a similar argument applies to any purely rational artificial 
intelligence we manage to create.

I'm interested: have people on this list speculated much about the 
morality of a purely rational intelligence? If you value rationality, as 
extropians do, then surely the morality of this putative rational 
artificial intelligence would be of great interest - it should be the 
code we all live by. Rationality means slicing away all arbitrary 
customs, and reducing decisions to a cost-benefit analysis of foreseeable 
consequences. This is at once no morality, and a perfect morality. 
Hmm...Zen-like truth, or vacuous pseudo-profundity - you decide. :)
