<div dir="auto"><div><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Wed, Mar 20, 2024, 8:27 AM Keith Henson via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On Tue, Mar 19, 2024 at 6:32 PM Mike Dougherty <<a href="mailto:msd001@gmail.com" target="_blank" rel="noreferrer">msd001@gmail.com</a>> wrote:<br>
><br>
> On Tue, Mar 19, 2024, 9:13 PM Keith Henson via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<br>
>><br>
>> A point I don't think Eliezer considers is that humans are dangerous.<br>
>> Humans have been selected for war, from circulating xenophobic memes<br>
>> when it looks like resources will get tight, to irrational optimism<br>
>> about winning wars.<br>
>><br>
>> At least AIs have not been selected for such traits.<br>
><br>
> Do you think the majority of humans will react to AI in their daily lives as 'stealing jobs' (an artificially scarce resource)? Except instead of taking undocumented workers' work, AI is going to take most of the so-called knowledge work. Doctors will be gone before nurses, but soon after that, patients will also adjust to bedside carebots.<br>
<br>
You are not thinking of nanotech medicine which can cure anything<br>
while you are walking around.<br></blockquote></div></div><div dir="auto"><br></div><div dir="auto">That's because such nanotech medicine is unlikely to become available to the general public within the next few (say, 5) years, whereas many people worry that AI will replace a significant number of knowledge workers within that timeframe.</div>
</div>