<div dir="auto">I don't think we saw deep learning coming, honestly.<div dir="auto"><br></div><div dir="auto">AlphaGo and GPT-3 shocked a lot of people.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Jun 6, 2022, 11:52 AM Adrian Tymes via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">I seem to be alive, and I have strong reason to believe that many other people are too.<div><br></div><div>How many predictions were there that AGI would kill everyone by, say, 2020?</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Jun 6, 2022 at 8:41 AM Darin Sunley via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>Yudkowsky has been saying versions of this for at least 15 years, and it's as true now as it was then.</div><div><br></div>If we aren't already under the complete and absolute control of a superintelligent AGI (Yes, this is isomorphic to "God exists"), we're all dead. 
It really is that simple.<div><br></div><div>Like the overwhelming majority of people who've been aware of these issues since the 90's, Yudkowsky is an atheist, so naturally he lacks even this possibility for the narrowest sliver of optimism.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Jun 6, 2022 at 8:38 AM BillK via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Eliezer Yudkowsky has written (at last!) a long article listing the<br>
reasons that Artificial General Intelligence will kill everybody.<br>
<<a href="https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities" rel="noreferrer noreferrer" target="_blank">https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities</a>><br>
Quotes:<br>
AGI Ruin: A List of Lethalities<br>
by Eliezer Yudkowsky 5th Jun 2022<br>
<br>
Crossposted from the AI Alignment Forum. May contain more technical<br>
jargon than usual.<br>
<br>
Here, from my perspective, are some different true things that could<br>
be said, to contradict various false things that various different<br>
people seem to believe, about why AGI would be survivable on anything<br>
remotely resembling the current pathway, or any other pathway we can<br>
easily jump to.<br>
-----------------<br>
<br>
Over 100 comments on the article so far.<br>
I would expect that most people will be very reluctant to accept that<br>
a runaway artificial intelligence is almost certain to kill all humans.<br>
<br>
BillK<br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div>
</blockquote></div>
</blockquote></div>