[ExI] Is AGI development going to destroy humanity?

William Flynn Wallace foozler83 at gmail.com
Sat Apr 2 13:58:50 UTC 2022

I cannot seem to find a way to search for AGI without getting 'adjusted
gross income'.  Is there a way?  Just what is AGI?   bill w

On Sat, Apr 2, 2022 at 8:56 AM spike jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> -----Original Message-----
> From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of
> BillK via extropy-chat
> Sent: Saturday, 2 April, 2022 4:03 AM
> To: Extropy Chat <extropy-chat at lists.extropy.org>
> Cc: BillK <pharos at gmail.com>
> Subject: [ExI] Is AGI development going to destroy humanity?
> MIRI announces new "Death With Dignity" strategy
> by Eliezer Yudkowsky       2nd Apr 2022
> >...
> <https://www.lesswrong.com/posts/j9Q8bRmwCgXRYAgcJ/miri-announces-new-death-with-dignity-strategy>
> >...(This article doesn't appear to be an April Fool's joke. Eliezer seems
> to have reached the conclusion that AGI development is going to
> destroy humanity.   BillK)
> Quotes:
> >...It's obvious at this point that humanity isn't going to solve the
> alignment problem, or even try very hard, or even go out with much of a
> fight.  Since survival is unattainable, we should shift the focus of our
> efforts to helping humanity die with slightly more dignity...
> --------------
> BillK
> _______________________________________________
> BillK, this article is classic Eliezer.  We have known him personally since
> we first met him at a local Foresight Institute conference when he was 18,
> some 24 years ago.  He is far more convinced than most of us that unfriendly
> AI will destroy humanity.  He made a number of converts to that view,
> including some bright stars on ExI, but I am not among them.  I am very
> impressed with his talent, his writing skills, and his analysis, but I have
> reasons to doubt the notion of unfriendly AI destroying humanity.
> I am eager to discuss the notion in this forum, as we did extensively in
> the 90s, for we have a lot more data now, including a critical datum: BI
> (biological intelligence, as opposed to artificial) has tried software
> weapons of mass destruction, and is trying them constantly, yet... we are
> still here, stubborn survivors insisting we are getting better:
> https://www.youtube.com/watch?v=Jdf5EXo6I68
> even as some clearly talented AI experts continue promoting their own
> plausible theories, like Ann Elk.
> I can see bigger threats to humanity than runaway unfriendly AI, but none
> of these threats would destroy all of humankind.  In all of the grim
> scenarios I can easily foresee, most of the African continent survives.
> spike
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
