[ExI] Losing control (was: Unfriendly AI is a mistaken idea.)

Eliezer S. Yudkowsky sentience at pobox.com
Sun Jun 17 08:02:04 UTC 2007


Stathis Papaioannou wrote:
> 
> The most frightening thing for some people contemplating a technological 
> future is that they will somehow be forced to become cyborgs or whatever 
> lies in store.

Yes, loss of control can be very frightening.  That is why many people 
feel more comfortable driving than flying, even though flying is 
vastly safer.

> It is of course very important that no-one be forced to 
> do anything they don't want to do.

Cheap slogan.  What about five-year-olds?  Where do you draw the line?

Someone says they want to hotwire their brain's pleasure center; they 
say they think it'll be fun.  A nearby AI reads off their brain state 
and announces unambiguously that they have no idea what'll actually 
happen to them - they're definitely working based on mistaken 
expectations.  They're too stubborn to listen to warnings, and they're 
picking up the handy neural soldering iron (they're on sale at 
Wal-Mart, a very popular item).  What's the moral course of action? 
For you?  For society?  For a superintelligent AI?

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence