[ExI] What might be enough for a friendly AI?

Samantha Atkins sjatkins at mac.com
Thu Nov 18 17:23:38 UTC 2010


On Nov 17, 2010, at 11:14 PM, spike wrote:

>  
> From: extropy-chat-bounces at lists.extropy.org [mailto:extropy-chat-bounces at lists.extropy.org] On Behalf Of Samantha Atkins
>> … On Behalf Of Florent Berthet
> Subject: Re: [ExI] What might be enough for a friendly AI?
>  
> >>>…It may just be me, but this whole friendliness thing bothers me.
>  
> >>Good.  It should bother you.  It bothers anyone who really thinks about it.
>  
> >>>…I don't really mind dying if my successors (supersmart beings or whatever) can be hundreds of times happier than me…
> …?
>  
> >>Florent, you are a perfect example of a dangerous person to have on the AGI development team.  You (and I too) might go down this perfectly logical line of reasoning, then decide to take it upon ourselves to release the AGI, in order to maximize happiness.
>  
> >This is the Cosmist or Terran question.  If you considered it very highly probable that the AGIs would be fantastically brilliant and wonderful beyond imagining AND would be the doom of humanity, then would you still build it, or donate to and encourage building it?  I would, but with very considerable hesitation and without feeling all that great about it. - samantha
>  
> OK, Samantha, now we must add you to the long and growing list of dangerous people to have on the AGI development team.  Your comment makes my point exactly.  For some member of the team to intentionally release the AGI requires no crazed maniac, no drug-addled bumbler, no insanely greedy capitalist.  You are none of these, nor am I (perhaps a sanely greedy capitalist), but honesty compels me to confess I would seriously consider releasing the beast.  With rational players like you, me, Florent, and others entertaining the notion, we can be sure that someone on some development team will eventually release the AGI.

I confess I sometimes get very bored and frustrated being only a somewhat evolved chimp and dealing constantly with the lovable but frustrating yammering of other slightly evolved chimps.  If I had a chance to introduce into the world something much more interesting and quite obviously better, then I think it very likely I would do so.

>  
> I am against uploading anyone against her will.  My own actions might depend on whether the AGI can convince me it would not do that.  But I am fully convinced that if silicon based AGI is possible, it will not be contained very long.  Those who work on friendly AGI likely know this too.  Since many of us are atheists, the saying becomes: Good luck and nothingspeed.

I am against it in principle too.  However, if I knew that it was that or a species-wide calamity with no possibility of any form of continued existence, then I would have to consider it.

- samantha


