[ExI] Unfriendly AI is a mistaken idea.
John K Clark
jonkc at att.net
Mon May 28 16:03:57 UTC 2007
Stathis Papaioannou Wrote:
> I was thinking of a time when we have total control over our minds and
> sufficient control over our environment such that our continued survival
> is no longer an issue.
I am a bit uncomfortable seeing "we have total control over our minds" and
"our continued survival" in the same sentence. One of the few post-Singularity
predictions I am willing to make is that drug addiction will still be a major
problem, because it is a positive feedback loop that easily spirals out of
control; it may even be the explanation for the Fermi Paradox. I hope not.
> so despite John Clark's point that evolution will not stop just because it
> isn't flesh and blood, I am hopeful that the eating the universe scenario
> will at least be delayed.
I don't see why that sort of stagnation would make you hopeful. I agree with
you that reproducing just to reproduce is a bit tacky, but I think the will
to power is noble. I applaud wanting to understand how everything works and
to do great things; but to do that you need lots of brain power, for that you
need lots of matter and energy, and for that you need an engineered universe.
But then, I've always preferred an artificial environment; nature sucks.
John K Clark