[ExI] What might be enough for a friendly AI?

John Grigg possiblepaths2050 at gmail.com
Sat Nov 20 01:56:15 UTC 2010


Stathis wrote:
> But it may be that it is *impossible* to convince a particular jailer
> to let you out. If you had godlike intelligence this might be obvious
> to you. If you had godlike intelligence but only incomplete access to
> your jailer's neurological makeup it might be *impossible* to work out
> if the jailer can be talked into letting you out or not, and having
> godlike intelligence this impossibility might be obvious to you. Being
> superintelligent is not the same as being omnipotent.

I just hope the jailer *never* makes any kind of mistake...

John  ; )


