[ExI] Hard Takeoff
Damien Broderick
thespike at satx.rr.com
Wed Nov 17 17:38:17 UTC 2010
On 11/17/2010 10:50 AM, Richard Loosemore wrote:
> the more abstract the goal, the more that the actual behavior of the AGI
> depends on a vast network of interpretation mechanisms, which translate
> the abstract supergoal into concrete actions. Those interpretation
> mechanisms are a completely non-deterministic complex system.
Indeed. Incidentally, Asimov was fully aware of the fragility and
brittleness of his Three Laws, and notoriously ended up with his
obedient benevolent robots controlling and reshaping a whole galaxy of
duped humans. This perspective was explored very amusingly by the
brilliant John Sladek in many stories, and he crystallized it superbly
in two words from an AI: "Yes, 'Master'."
Damien Broderick