[ExI] Why “Everyone Dies” Gets AGI All Wrong by Ben Goertzel

Adam A. Ford tech101 at gmail.com
Fri Oct 3 05:26:05 UTC 2025


>  Getting what we desire may cause us to go extinct
Perhaps what we need is indirect normativity
<https://www.scifuture.org/indirect-normativity/>

Kind regards,

Adam A. Ford

Science, Technology & the Future <http://scifuture.org> |
YouTube <http://youtube.com/subscription_center?add_user=TheRationalFuture> | FB
<https://www.facebook.com/adam.a.ford> | X <https://x.com/adam_ford> |
LinkedIn <https://www.linkedin.com/in/adamaford/> | Bsky
<https://bsky.app/profile/adamford.bsky.social> | MU
<http://www.meetup.com/Science-Technology-and-the-Future>


On Thu, 2 Oct 2025 at 03:09, Keith Henson via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> The problem with super-capable machines is not them, it is people.
> What we desire was fixed in the Stone Age.  Getting what we desire may
> cause us to go extinct, as in the Clinic Seed story.
>
> Keith
>
> On Wed, Oct 1, 2025 at 9:32 AM spike jones via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> >
> >
> >
> > -----Original Message-----
> > From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf
> Of BillK via extropy-chat
> >
> >
> > A reaction to Eliezer Yudkowsky and Nate Soares's book "If Anyone
> Builds It, Everyone Dies", which is getting a bit of media attention.
> >
> > <https://bengoertzel.substack.com/p/why-everyone-dies-gets-agi-all-wrong>
> > Quote:
> > An intelligence capable of recursive self-improvement and transcending
> from AGI to ASI would naturally tend toward complexity, nuance, and
> relational adaptability rather than monomaniacal optimization.
> > ------------------
> >
> > >...A good description of why we should survive the arrival of AGI.
> (probably!).
> > BillK
> >
> > _______________________________________________
> >
> >
> > Ja, and this worries me: overstating threats risks provoking a small
> faction of extremists into extreme action, as we are seeing politically
> in the USA.
> >
> > If software surpasses humans in "intelligence", it doesn't necessarily
> kill EVERYone.  Granted, people might kill each other because of what the
> software does, but that's different.  Some of us anticipated that; we have
> survival and defense strategies in place.  Some of which might actually
> work.  For a while.  Maybe.
> >
> > The Berkeley computer scientists and clubs have been pondering this
> question and have formed strategy groups.  I don't have links, but some of
> their lectures and meetings are online in podcast format.  I tuned into one
> in real time a few days ago, focused on detection and containment strategies.
> >
> > Also note there are humans on this planet who have never used, perhaps
> never even seen, a computer.  They are not dependent, as we are, on modern
> electronic infrastructure just to survive (if suddenly without it, most of
> us reading this message would starve within a month or two).  AGI isn't
> likely to impact their lives much.
> >
> > One possibility is that AGI wrecks the hell out of us, then the
> primitives gradually repopulate the planet.  Then of course their
> descendants make all the same mistakes a coupla hundred years down the
> road.  Humanity gets stuck in a destructive cycle, a kind of Groundhog
> Century syndrome.
> >
> > SciFi writers among us, you may run with that ball.  Say nice things
> about me for giving you the idea.  I will co-author if you wish.
> >
> > spike
> >
> >