[ExI] Yudkowsky in Time on AI Open Letter.

Adrian Tymes atymes at gmail.com
Thu Mar 30 18:58:26 UTC 2023


On Thu, Mar 30, 2023 at 11:49 AM Darin Sunley via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> We live in a timeline where Eliezer Yudkowsky just got published in Time
> magazine responding to a proposal to halt or at least drastically curtail
> AI research due to existential risk fears.
>
> Without commenting on the arguments on either side or the qualities
> thereof, can I just say how f*cking BONKERS that is?!
>

Welcome to the future we've been fighting for, where things like AI and
existential risk are taken seriously.