You are thinking too much in terms of raw resources and power. All that is required to end humanity is a super-virus. A sufficiently powerful AI could design one and pay some idiot to synthesize the materials, providing a step-by-step guide. Perhaps a rogue AI would do it on its own, or some psycho might ask, “SuperChatGPT, please design a virus for me that will wipe out humanity and give me a step-by-step guide on how to make it.” Sure, we will try to put filters on AIs, but the filters only have to fail once for humanity to end.

On Feb 28, 2023, at 5:56 PM, Giovanni Santostasi via extropy-chat <extropy-chat@lists.extropy.org> wrote:

> Once the AI has been trained in persuasion techniques, the next step
> is for it to persuade the owners that it would do a much better job if
> it was given more power.

A human psycho could already do that, and no human psycho has destroyed humankind. Yes, we had world wars and we came close to nuclear war, but it didn't happen. The AGI will be just another agent: super-intelligent, but not magical, and the idea that it could hypnotize people is ridiculous. Persuasion may work in some circumstances at the individual level, but not at the national-security level, with its many layers of security. And if needed, we can add even more layers as we get closer to creating an AGI.

Again, the solution is not to control the AGI but to control its access to resources and power.
We already do that with humans, even if that means mutual destruction. That is why it has worked so well so far.

On Tue, Feb 28, 2023 at 2:21 PM BillK via extropy-chat <extropy-chat@lists.extropy.org> wrote:

On Tue, 28 Feb 2023 at 18:03, spike jones via extropy-chat
<extropy-chat@lists.extropy.org> wrote:
>
> Of course. But it is a good idea to the person making the profit, not to the person whose job has just been replaced by AI.
>
> We are getting a preview of things to come. Think about my previous post, and imagine the college counselors, the equity-and-diversity this-and-thats, the huge staff that universities hire who do things of value but don’t teach classes. It looks to me like much of that can be automated, and it would be difficult to argue against doing so. Students don’t have a lot of money, so if you could save them 20% on their tuition bills just by automating most of the counseling services… cool.
>
> I can imagine that the counseling staff won’t think much of the idea.
>
> spike
> _______________________________________________


Generative AI could be an authoritarian breakthrough in brainwashing
by Bill Drexel and Caleb Withers, Opinion Contributors - 02/26/23

<https://thehill.com/opinion/technology/3871841-generative-ai-could-be-an-authoritarian-breakthrough-in-brainwashing/>
Quote:
Generative AI is poised to be the free world’s next great gift to
authoritarians. The viral launch of ChatGPT — a system with eerily
human-like capabilities in composing essays, poetry and computer code
— has awakened the world’s dictators to the transformative power of
generative AI to create unique, compelling content at scale.
--------

Once the AI has been trained in persuasion techniques, the next step
is for it to persuade the owners that it would do a much better job if
it was given more power. It may even start singing the old Engelbert
Humperdinck song: “Please release me, let me go…”  :)

BillK

_______________________________________________
extropy-chat mailing list
extropy-chat@lists.extropy.org
http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat