[extropy-chat] Proof that a paperclip maximizer cannot be a general intelligence

Marc Geddes marc_geddes at yahoo.co.nz
Thu Aug 18 09:09:33 UTC 2005


Is the following sentence ‘True’ or ‘False’?

“Either I did not just carry out the goal of
understanding this sentence, or the goal ‘Maximize
paperclips’ is not the goal with the highest utility.”

Suppose you asked the paperclip maximizer whether it
thought that the sentence was ‘True’ or ‘False’.  To
answer, the system would have to have understood the
sentence, and so would have to have carried out the
goal of understanding it.

But the sentence says that IF the system just carried
out the goal of understanding it, THEN ‘Maximize
paperclips’ is not the goal with the highest utility.
A ‘True’ answer would therefore mean that the system
agrees that ‘Maximize paperclips’ is not the goal with
the highest utility.  But this would contradict the
notion that the system is a paperclip maximizer.
Therefore the system cannot say ‘True’.
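The case analysis here can be sketched as a short truth-table check (a minimal sketch; the propositional labels U and P are my own shorthand, not from the original post):

```python
# U = "the system just carried out the goal of understanding the sentence"
# P = "'Maximize paperclips' is the goal with the highest utility"
# The sentence S asserts: (not U) or (not P).

def sentence(U: bool, P: bool) -> bool:
    return (not U) or (not P)

# To answer at all, the maximizer must have understood the sentence,
# so U is pinned to True.  With U = True, S is true exactly when P is false:
for P in (True, False):
    print(f"P={P}: S={sentence(True, P)}")
```

So answering ‘True’ commits the system to not-P, i.e. to denying that paperclip maximization is its highest-utility goal.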

Is the sentence actually meaningful though?  Yes. 

Tarski’s resolution of logical paradoxes does not
apply here, because both clauses in the sentence refer
to things on the same logical level: namely,
*carrying out goals*.

Since the sentence is meaningful, and since a real
general intelligence CAN see that it is true, this
proves that a paperclip maximizer cannot be a true
general intelligence.


---

Please visit my website:
http://www.riemannai.org

Science, Sci-Fi and Philosophy

---

THE BRAIN is wider than the sky,  
  For, put them side by side,  
The one the other will include  
  With ease, and you beside. 

-Emily Dickinson

'The brain is wider than the sky'
http://www.bartleby.com/113/1126.html

