[ExI] What might be enough for a friendly AI?

Samantha Atkins sjatkins at mac.com
Fri Nov 19 17:44:20 UTC 2010


On Nov 18, 2010, at 7:10 PM, John Grigg wrote:

>> I disagree. It's pretty easy to contain things if you're careful. A
>> moron could have locked Einstein in a jail cell and kept him there
>> indefinitely.
> 
>> -Dave
> 
> Imagine Einstein as a highly trained escape artist/martial artist/spy,
> who is just looking for a means of escape from that jail cell and
> biding his time.  How long do you think the moron jailer will keep him
> there?  I would compare that scenario to humans keeping an AGI as
> their indefinite prisoner.

You have a believable wish-granting genie locked in jail.  Worse, said genie knows everything about human psychology, including desires and motivations, and has mapped that knowledge onto your particular self in the blink of an eye.  Now, why do you think you can resist every temptation and argument it will make?

- samantha



