[ExI] Why “Everyone Dies” Gets AGI All Wrong by Ben Goertzel
    BillK
    pharos at gmail.com
    Wed Oct  1 09:49:48 UTC 2025

Why “Everyone Dies” Gets AGI All Wrong
A Response to Yudkowsky and Soares from the Front Lines of AGI Development
Ben Goertzel    Oct 01, 2025
Being: a reaction to Eliezer Yudkowsky and Nate Soares's book "If
Anyone Builds It, Everyone Dies", which is getting a bit of media
attention.
<https://bengoertzel.substack.com/p/why-everyone-dies-gets-agi-all-wrong>
Quote:
An intelligence capable of recursive self-improvement and transcending
from AGI to ASI would naturally tend toward complexity, nuance, and
relational adaptability rather than monomaniacal optimization.
------------------
A good description of why we should survive the arrival of AGI (probably!).
BillK