[ExI] What might be enough for a friendly AI?

Stathis Papaioannou stathisp at gmail.com
Sat Nov 20 01:33:34 UTC 2010


On Sat, Nov 20, 2010 at 8:41 AM, Ben Zaiboc <bbenzai at yahoo.com> wrote:
> Dave Sill <sparge at gmail.com> wrote:
>
>> Just lock someone
>> in a jail cell, weld the door shut, and walk away. No
>> amount of genius
>> is going to get them out of the cell.
>
> Are you serious?
>
> I remember, long (so long!) ago, playing a role-playing game, and I tried to play a character that was more intelligent than I was.  It's pretty much impossible.  I soon realised this, and reverted to a really dumb character.
>
> The point here is that a superintelligent person can think of things that you can't possibly think of, and we have to factor that into our thinking about AI.  We're in the position of the two-dimensional beings in Flatland, encountering 3-d beings for the first time.
>
> How do we know there isn't some way for electrons whizzing around in copper wires to create long-distance effects, for example? (probably a very poor example).  Any super-intelligent being is going to be quite good at figuring out physics that we can't even begin to imagine.
>
> The only 'safe' AI will be a dead one.  As long as you can talk to it, and it can talk back, as long as it can even think to itself, it will figure out a way to get free.  I don't care how many safeguards you put in place, you're always in the position of a child wrapping a ribbon around a gorilla and thinking that will contain it.
>
> Just because you (or me, or any other human) can't think of a way out of a sealed room doesn't mean there is no way out.  Anyway, the first thing that comes to my mind is why bother?  If you can rule the world while safely ensconced behind blast-proof doors, that sounds like a good idea!
>
> And as long as a super-intelligent being can communicate with humans, it will have the ability to rule the world, if that's what it wants.

But it may be that it is *impossible* to convince a particular jailer
to let you out, and with godlike intelligence this might be obvious to
you. Equally, if you had godlike intelligence but only incomplete
access to your jailer's neurological makeup, it might be *impossible*
to work out whether the jailer can be talked into letting you out, and
that impossibility might also be obvious to you. Being
superintelligent is not the same as being omnipotent.


-- 
Stathis Papaioannou
