[ExI] AGI is going to kill everyone

BillK pharos at gmail.com
Mon Jun 6 14:36:45 UTC 2022

Eliezer Yudkowsky has written (at last!) a long article listing the
reasons that Artificial General Intelligence will kill everybody.
AGI Ruin: A List of Lethalities
by Eliezer Yudkowsky 5th Jun 2022

Crossposted from the AI Alignment Forum. May contain more technical
jargon than usual.

Here, from my perspective, are some different true things that could
be said, to contradict various false things that various different
people seem to believe, about why AGI would be survivable on anything
remotely resembling the current pathway, or any other pathway we can
easily jump to.

There are over 100 comments on the article so far.
I would expect that most people will be very reluctant to accept that
a runaway artificial intelligence is almost certain to kill all humans.

