It was difficult to watch much of the interview with Eliezer because, from the beginning, he makes a lot of emotive and extreme claims before offering any reasoned argument. I believe he knows better; he has, after all, written a tremendous amount about how to argue better.

------- Original Message -------
On Thursday, March 30th, 2023 at 3:20 PM, Gordon Swobe via extropy-chat <extropy-chat@lists.extropy.org> wrote:

> I just watched Eliezer's interview with Lex Fridman. It was posted on YouTube today. Worth a watch.
>
> https://www.youtube.com/watch?v=AaTRHFaaPG8&t=4656s
>
> -gts
>
> On Thu, Mar 30, 2023 at 12:49 PM Darin Sunley via extropy-chat <extropy-chat@lists.extropy.org> wrote:
>
>> https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/
>>
>> We live in a timeline where Eliezer Yudkowsky just got published in Time magazine responding to a proposal to halt or at least drastically curtail AI research due to existential risk fears.
>>
>> Without commenting on the arguments on either side or the qualities thereof, can I just say how f*cking BONKERS that is?!
>>
>> This is the sort of thing that damages my already very put-upon and rapidly deteriorating suspension of disbelief.
>>
>> If you sent 25-years-ago-me the single sentence "In 2023, Eliezer Yudkowsky will get published in Time magazine responding to a proposal to halt or at least drastically curtail AI research due to existential risk fears," I would probably have concluded I was already in a simulation.
>>
>> And I'm not certain I would have been wrong.