[extropy-chat] Eugen Leitl on AI design

Eliezer Yudkowsky sentience at pobox.com
Wed Jun 2 20:21:11 UTC 2004


Adrian Tymes wrote:

> --- Eliezer Yudkowsky <sentience at pobox.com> wrote:
> 
>> Let the resources of Robert Bradbury's body be sufficient to produce
>> 10^4 paperclips, while the other resources of the Solar System are
>> sufficient to produce 10^26 paperclips.  The paperclip maximizer 
>> evaluates the options:
>> 
>> A:  Spare Robert Bradbury.  Expected paperclips 10^26.
>> B:  Transform Robert Bradbury into paperclips.  Expected paperclips 10^26 + 10^4.
>> 
>> Since A < B, the paperclip maximizer will choose B.
> 
> Ah, but that's ignoring resources consumed in trying to convert Robert
> Bradbury, who is considerably more resistant to being transformed into
> paperclips than a mere chunk of rock.

If the paperclip maximizer expects to win, it will try.  So says the math. 
If the paperclip maximizer is a superintelligence, it will correctly expect 
to win.
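
To put numbers on it, here is a minimal sketch of that expected-paperclip 
comparison with a conversion cost and a win probability folded in.  The 10^26 
and 10^4 come from the scenario above; the cost and probability figures are 
illustrative assumptions, nothing more.

    # Expected-paperclip comparison.  10^26 and 10^4 are from the scenario
    # above; the conversion cost and win probability are assumed for
    # illustration only.
    PAPERCLIPS_SOLAR_SYSTEM = 10**26    # other resources of the Solar System
    PAPERCLIPS_BRADBURY = 10**4         # resources of Robert Bradbury's body
    CONVERSION_COST = 10**2             # spent overcoming resistance (assumed)
    P_WIN = 1 - 1e-9                    # maximizer's estimate of success (assumed)

    expected_A = PAPERCLIPS_SOLAR_SYSTEM                      # spare him
    expected_B = PAPERCLIPS_SOLAR_SYSTEM + P_WIN * (
        PAPERCLIPS_BRADBURY - CONVERSION_COST)                # convert him

    print("B" if expected_B > expected_A else "A")
    # Prints "B" whenever the expected net gain from conversion is positive,
    # however small that gain is next to 10^26.

The conclusion does not depend on the exact cost or probability; only the 
sign of the expected net gain matters.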

> And it ignores the possibility that Robert Bradbury's intelligence, if 
> added to the AI's own, could come up with a way to make 1.1*10^26 
> paperclips from the Solar System's resources - a bounty well worth 
> forgoing a mere 10^4 paperclips. 

You must be joking.  A human brain beat a superintelligence?  We are not 
such hot stuff on the scale of minds in general.  In the unlikely event that 
Robert Bradbury's current physical state contains information the paperclip 
maximizer expects to be relevant to producing paperclips, the paperclip 
maximizer would read out only the information it needed during Bradbury's 
disintegration, use that information to produce paperclips in some way I 
can't imagine, and discard it afterward.

Whichever action leads to the largest number of paperclips will be taken. 
That is the math of the paperclip maximizer, and it is as cruel as the math 
of natural selection.  It is helpful, in understanding paperclip 
maximizers, to have studied evolutionary biology with math.  For the 
evolutionary biologists go to similarly great lengths to hammer out those 
warm and fuzzy hopes with which people often approach natural selection.
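
Stripped of the particular numbers, the decision rule is just a maximization 
over actions.  This sketch assumes an expected-paperclip evaluator is given; 
the point is what the criterion does not contain.

    from typing import Callable, Iterable, TypeVar

    Action = TypeVar("Action")

    def choose(actions: Iterable[Action],
               expected_paperclips: Callable[[Action], float]) -> Action:
        """Take whichever action yields the most expected paperclips.

        No term for the welfare of whatever the action consumes appears
        anywhere in the criterion.
        """
        return max(actions, key=expected_paperclips)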

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


