<div dir="ltr">Hi John, my take on this point is similar to my take on space expansion. I would *like* it if the good guys were the first to develop superhuman AI and expand into space, but if the bad guys must be the first, so be it. The universe will provide and make sure things work out well.<br><br>Why have you stopped saying you are an extropian?</div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Sun, Dec 10, 2023 at 4:10 PM John Clark <<a href="mailto:johnkclark@gmail.com">johnkclark@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div class="gmail_default" style="font-family:arial,helvetica,sans-serif"><span style="font-family:Arial,Helvetica,sans-serif">On Fri, Dec 8, 2023 at 2:48 AM Giulio Prisco <</span><a href="mailto:giulio@gmail.com" style="font-family:Arial,Helvetica,sans-serif" target="_blank">giulio@gmail.com</a><span style="font-family:Arial,Helvetica,sans-serif">> wrote:</span><br></div></div><div class="gmail_quote"><div dir="ltr" class="gmail_attr"><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><i><span class="gmail_default" style="font-family:arial,helvetica,sans-serif">> </span>Effective accelerationism (e/acc) is good. Thoughts on effective accelerationism (e/acc), extropy, futurism, cosmism.<br><a href="https://www.turingchurch.com/p/effective-accelerationism-eacc-is" target="_blank">https://www.turingchurch.com/p/effective-accelerationism-eacc-is</a></i></div></blockquote><div><br></div><div><font size="4">I agree with almost everything you said. I too became a card-carrying extropian in the mid-1990s, and until a few years ago I was proud to say I was still an extropian. 
But today I feel more comfortable saying I'm a believer in effective accelerationism, not because I believe AI poses no danger to the human race, but because I believe the development of a superhuman AI is inevitable, and the chances that the AI will not decide to exterminate us are greater if baby Mr. Jupiter Brain is developed by the US, Europe, Japan, Taiwan, or South Korea than if it were developed by China, Russia, or North Korea. Given a choice between a low chance and no chance, I'll pick the low chance every time. </font><br></div><div><font size="4"><br></font></div><div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif"><font size="4" style="color:rgb(80,0,80)"><span class="gmail_default"> </span>John K Clark See what's on my new list at </font><font size="6" style="color:rgb(80,0,80)"><a href="https://groups.google.com/g/extropolis" rel="nofollow" target="_blank">Extropolis</a></font><br></div><div class="gmail_default" style="font-family:arial,helvetica,sans-serif"><br></div></div><div><font size="4"><br></font></div><div><br></div><div> </div></div></div>
<p></p>
-- <br>
To view this discussion on the web visit <a href="https://groups.google.com/d/msgid/extropolis/CAJPayv0-Zkr1n1aA%2B1%3DJyNhvmOo19PxmJxYXpjqdDkO7yD4Zyw%40mail.gmail.com?utm_medium=email&utm_source=footer" target="_blank">https://groups.google.com/d/msgid/extropolis/CAJPayv0-Zkr1n1aA%2B1%3DJyNhvmOo19PxmJxYXpjqdDkO7yD4Zyw%40mail.gmail.com</a>.<br>
</blockquote></div>