[ExI] AGI is going to kill everyone

Darin Sunley dsunley at gmail.com
Mon Jun 6 22:07:16 UTC 2022


I don't think we saw deep learning coming, honestly.

AlphaGo and GPT-3 shocked a lot of people.

On Mon, Jun 6, 2022, 11:52 AM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> I seem to be alive, and I have strong reason to believe that many other
> people are too.
>
> How many predictions were there that AGI would kill everyone by, say, 2020?
>
> On Mon, Jun 6, 2022 at 8:41 AM Darin Sunley via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Yudkowsky has been saying versions of this for at least 15 years, and
>> it's as true now as it was then.
>>
>> If we aren't already under the complete and absolute control of a
>> superintelligent AGI (Yes, this is isomorphic to "God exists"), we're all
>> dead. It really is that simple.
>>
>> Like the overwhelming majority of people who've been aware of these
>> issues since the 90's, Yudkowsky is an atheist, so naturally he lacks even
>> this possibility for the narrowest sliver of optimism.
>>
>> On Mon, Jun 6, 2022 at 8:38 AM BillK via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> Eliezer Yudkowsky has written (at last!) a long article listing the
>>> reasons that Artificial General Intelligence will kill everybody.
>>> <
>>> https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
>>> >
>>> Quotes:
>>> AGI Ruin: A List of Lethalities
>>> by Eliezer Yudkowsky 5th Jun 2022
>>>
>>> Crossposted from the AI Alignment Forum. May contain more technical
>>> jargon than usual.
>>>
>>> Here, from my perspective, are some different true things that could
>>> be said, to contradict various false things that various different
>>> people seem to believe, about why AGI would be survivable on anything
>>> remotely resembling the current pathway, or any other pathway we can
>>> easily jump to.
>>> -----------------
>>>
>>> The article has drawn over 100 comments so far.
>>> I expect most people will be very reluctant to accept that a runaway
>>> artificial intelligence is almost certain to kill all humans.
>>>
>>> BillK
>>> _______________________________________________
>>> extropy-chat mailing list
>>> extropy-chat at lists.extropy.org
>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>