[ExI] Why “Everyone Dies” Gets AGI All Wrong by Ben Goertzel

spike at rainier66.com spike at rainier66.com
Wed Oct 1 16:30:59 UTC 2025



-----Original Message-----
From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of BillK via extropy-chat


Being: A reaction to Eliezer Yudkowsky and Nate Soares’s book “If anybody builds it everyone dies” which is getting a bit of media attention.

<https://bengoertzel.substack.com/p/why-everyone-dies-gets-agi-all-wrong>
Quote:
An intelligence capable of recursive self-improvement and transcending from AGI to ASI would naturally tend toward complexity, nuance, and relational adaptability rather than monomaniacal optimization.
------------------

>...A good description of why we should survive the arrival of AGI. (probably!).
BillK

_______________________________________________


Ja, and what worries me is that overstating threats carries its own potential harm: it can provoke a small faction of extremists to take extreme action, as we are seeing politically in the USA.

If software goes past humans in "intelligence," it doesn't necessarily kill EVERYone.  Granted, people might kill each other because of what the software does, but that's different.  Some of us anticipated that; we have survival and defense strategies in place.  Some of which might actually work.  For a while.  Maybe.

The Berkeley computer scientists and clubs have been pondering this question and have formed strategy groups.  I don't have links, but some of their lectures and meetings are online in podcast format.  I tuned into one in real time a few days ago, focused on detection and containment strategies.

Also note there are humans on this planet who have never used, perhaps never even seen, a computer.  They do not depend on modern electronic infrastructure just to survive, as we do (if suddenly deprived of it, most of us reading this message would starve within a month or two).  AGI isn't likely to impact their lives much.

One possibility is that AGI wrecks the hell out of us, then the primitives gradually repopulate the planet.  Then of course their descendants make all the same mistakes a coupla hundred years down the road.  Humanity gets stuck in a destructive cycle, a kind of Groundhog Century syndrome.

SciFi writers among us, you may run with that ball.  Say nice things about me for giving you the idea.  I will co-author if you wish.

spike



