[ExI] Yudkowsky in Time on AI Open Letter.

Darin Sunley dsunley at gmail.com
Thu Mar 30 18:46:53 UTC 2023


https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/

We live in a timeline where Eliezer Yudkowsky just got published in Time
magazine responding to a proposal to halt or at least drastically curtail
AI research due to existential risk fears.

Without commenting on the arguments on either side or the qualities
thereof, can I just say how f*cking BONKERS that is?!

This is the sort of thing that damages my already very put-upon and
rapidly deteriorating suspension of disbelief.

If you had sent 25-years-ago-me the single sentence "In 2023, Eliezer
Yudkowsky will get published in Time magazine responding to a proposal to
halt or at least drastically curtail AI research due to existential risk
fears," I would probably have concluded I was already in a simulation.

And I'm not certain I would have been wrong.