<div dir="ltr"><div dir="ltr">You strike to the quick of it! He's willing to bomb non-compliant data centers out of existence. After his Time interview, I can't take him seriously any longer. </div><div dir="ltr"><br></div><div>Climate change is another great example, and I would add a third playing out in real time now in the US (and on this list), some have an irrational fear that Trump is an infinite threat and are pretty much willing to do ANYTHING to prevent his re-election.</div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Mar 20, 2024 at 5:11 AM efc--- via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
<br>
On Wed, 20 Mar 2024, Samantha via extropy-chat wrote:

> Oh my. Eliezer the luddite. I have been of the opinion for over two
> decades that there is no way for humanity to continue at the current
> level, much less transcend this level, without massively more effective
> intelligence deployed than today, whether human or artificial.
>
> If this is so, then AGI is *essential* to human flourishing.
>
> Also, "shut it down" means shutting down and controlling the entire
> internet at this point. The work has spread well beyond a few large
> tech companies. It is vibrantly alive and being advanced in the open
> source world. Surely Eliezer doesn't believe humanity could survive the
> level of tyranny it would actually take to "shut it down"?

The problem with Eliezer is that he deals with infinite threats. If you
deal with infinite threats, every action is excusable, even preferable,
in order to avoid infinite evil.

Just look at the more extreme climate change activists: they deal with
infinite threats (everyone will die tomorrow), and therefore anything is
allowed.

Another classic is Pascal's wager. Assign infinite good and infinite
bad, and all calculation, nuance and rationality go out the window.
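
To spell out the arithmetic (a minimal sketch, where p stands for
whatever probability you assign the threat): once a single outcome is
valued at infinity,

    E[act] = p * (+infinity) + (1 - p) * (finite cost) = +infinity, for any p > 0

so the expected value is the same whether p is 0.5 or 0.000001. The
probability estimate, which is where all the nuance and rationality
live, drops out of the decision entirely.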

Best regards,
Daniel

> - samantha
>
> On 3/13/24 17:31, Keith Henson via extropy-chat wrote:
>> I have been perusing X recently to see what Anders Sandberg and
>> Eliezer have been saying.
>>
>> Eliezer Yudkowsky
>> @ESYudkowsky · 4h
>> I don't feel like I know. Does anyone on the ground in DC feel like
>> making a case? To be clear, the correct action is "Shut it down
>> worldwide" and other actions don't matter; I'm not interested in who
>> tries to regulate it more strictly or has nicer goals.
>>
>> Replying to @ESYudkowsky · 16m
>> Given the economic, political, and international realities, do you
>> think "shut it down" is possible? If not, are there any other options?
>> You might remember a story I posted on sl4 where a very well aligned
>> AI caused humans to go extinct (though nobody died).
>>
>> Keith