[ExI] AGI is going to kill everyone
Darin Sunley
dsunley at gmail.com
Mon Jun 6 15:37:01 UTC 2022
Yudkowsky has been saying versions of this for at least 15 years, and it's
as true now as it was then.
If we aren't already under the complete and absolute control of a
superintelligent AGI (Yes, this is isomorphic to "God exists"), we're all
dead. It really is that simple.
Like the overwhelming majority of people who've been aware of these issues
since the '90s, Yudkowsky is an atheist, so naturally even this narrowest
sliver of optimism is unavailable to him.
On Mon, Jun 6, 2022 at 8:38 AM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> Eliezer Yudkowsky has written (at last!) a long article listing the
> reasons that Artificial General Intelligence will kill everybody.
> <https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities>
> Quotes:
> AGI Ruin: A List of Lethalities
> by Eliezer Yudkowsky 5th Jun 2022
>
> Crossposted from the AI Alignment Forum. May contain more technical
> jargon than usual.
>
> Here, from my perspective, are some different true things that could
> be said, to contradict various false things that various different
> people seem to believe, about why AGI would be survivable on anything
> remotely resembling the current pathway, or any other pathway we can
> easily jump to.
> -----------------
>
> Over 100 comments on the article so far.
> I would expect that most people will be very reluctant to accept that
> a runaway artificial intelligence is almost certain to kill all humans.
>
> BillK