<div dir="ltr"><br><div><br></div><div>I don't see any of this as a problem at all. You just need to find a way to build and track consensus around what EVERYONE wants, and then use a weighting algorithm that gives greater voting weight to less wealthy people, and so on (perhaps with only a minor vote for AI systems, or for systems emulating dead people?). After all, if you know what everyone wants, THAT, by definition, is consensus. And SAIs will help us understand better what we as individuals really want, and how to be just and fair about it all.</div><div><br></div></div><br><div class="gmail_quote gmail_quote_container"><div dir="ltr" class="gmail_attr">On Fri, Oct 3, 2025 at 3:37 AM BillK via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, 3 Oct 2025 at 06:26, Adam A. Ford <<a href="mailto:tech101@gmail.com" target="_blank">tech101@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div>>
Getting what we desire may cause us to go extinct</div><div>Perhaps what we need is <a href="https://www.scifuture.org/indirect-normativity/" target="_blank">indirect normativity</a></div><div><br></div><div><div dir="ltr" class="gmail_signature"><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div dir="ltr"><div>Kind regards,<span class="gmail_default" style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)"> </span>Adam A. Ford<br><div><font size="1"> </font><font style="font-family:verdana,sans-serif" size="1"><span style="color:rgb(102,102,102)"><a href="http://scifuture.org" target="_blank">Science, Technology & the Future</a></span><span style="color:rgb(102,102,102)"> </span></font></div></div><div><div>
</div>
</div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div>_______________________________________________</div></blockquote><div><br></div><div><br></div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default">Yes, everybody agrees that AI alignment is a problem that needs to be solved. :)</div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default">And using Initial versions of AI to assist in devising alignment rules is a good idea. After all, we will be using AI to assist in designing everything else!</div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default">I see a few problems though. The early versions of AI are likely to be aligned to fairly specific values. Say, for example, in line with the values of the richest man in the world. This is unlikely to iterate into ethical versions suitable for humanity as a whole.</div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default">The whole alignment problem runs up against the conflicting beliefs and world views of the widely different groups of humanity. </div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default">These are not just theoretical differences of opinion. These are fundamental conflicts, leading to wars and destruction.</div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default">An AGI will have to be exceptionally persuasive to get all humans to agree with the final ethical system that it designs!</div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default"><br></div><div style="font-family:arial,sans-serif;font-size:small;color:rgb(0,0,0)" class="gmail_default">BillK</div></div></div>
</div>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</blockquote></div>