[ExI] Hard Takeoff

Stefano Vaj stefano.vaj at gmail.com
Mon Nov 15 00:09:36 UTC 2010


2010/11/14 Michael Anissimov <michaelanissimov at gmail.com>:
> I disagree about the relative risk, but I'm worried about this too.

"Risk" is a concept which requires a definition of what is feared, why
it is feared and whether really it makes sense to make efforts to
avoid it.

If you think about it, previous human generations have routinely had
control of society taken from them by subsequent ones, who have
sometimes killed them, at other times segregated them in "retirement"
roles and institutions, expelled them from creative work, made them
dependent on others' decisions, alienated them from their contemporary
cultures, and so forth.

At the same time, I have never heard such circumstances put forward as
a rationale for drastic birth control or for lobotomizing children.

Now, while I think that some scenarios concerning "AGI" are grossly
anthropomorphic and delusional, what exactly could our hypothetical
"children of the mind" do that would be worse than what our biological
children have done?

If anything, "human-mind" emulation and replication technology might
end up being more protective of our legacy - see under mind
"uploading" - than past history has ever been for our predecessors. Or
not. But technology need not be "anthropomorphic" to be dangerous.
Perfectly "stupid" computers can be as dangerous as, or more dangerous
than, computers emulating some kind of human-like agency, whatever the
purpose of the latter might be.

-- 
Stefano Vaj


