<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; line-break: after-white-space;" class=""><blockquote type="cite" class=""><div dir="ltr" class="">I just watched Eliezer's interview with Lex Fridman. It was posted on YouTube today. Worth a watch.</div></blockquote><div class=""><br class=""></div>I found it hilarious that Eliezer’s advice for young people is to not expect a long life (end of video). He’s really become a ray of sunshine.<br class=""><div><br class=""><blockquote type="cite" class=""><div class="">On Mar 30, 2023, at 5:20 PM, Gordon Swobe via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" class="">extropy-chat@lists.extropy.org</a>> wrote:</div><br class="Apple-interchange-newline"><div class=""><div dir="ltr" class="">I just watched Eliezer's interview with Lex Fridman. It was posted on YouTube today. Worth a watch.<br class=""><br class=""><a href="https://www.youtube.com/watch?v=AaTRHFaaPG8&t=4656s" class="">https://www.youtube.com/watch?v=AaTRHFaaPG8&t=4656s</a><br class=""><br class="">-gts</div><br class=""><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Mar 30, 2023 at 12:49 PM Darin Sunley via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" class="">extropy-chat@lists.extropy.org</a>> wrote:<br class=""></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr" class=""><div class=""><a href="https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/" target="_blank" class="">https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/</a><br class=""></div><div class=""><br class=""></div>We live in a timeline where Eliezer Yudkowsky just got published in Time magazine responding to a proposal to halt or at least drastically curtail AI research due to existential risk fears.<div class=""><br 
class=""></div><div class="">Without commenting on the arguments on either side or the qualities thereof, can I just say how f*cking BONKERS that is?!</div><div class=""><br class=""></div><div class="">This is the sort of thing that damages my already very put-upon and rapidly deteriorating suspension of disbelief.</div><div class=""><br class=""></div><div class="">If you had sent 25-years-ago-me the single sentence "In 2023, Eliezer Yudkowsky will get published in Time magazine responding to a proposal to halt or at least drastically curtail AI research due to existential risk fears," I would probably have concluded I was already in a simulation.</div><div class=""><br class=""></div><div class="">And I'm not certain I would have been wrong.</div></div>
_______________________________________________<br class="">
extropy-chat mailing list<br class="">
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" class="">extropy-chat@lists.extropy.org</a><br class="">
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank" class="">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br class="">
</blockquote></div>
</div></blockquote></div><br class=""></body></html>