<div dir="auto"><div class="gmail_quote" dir="auto"><div dir="ltr" class="gmail_attr">On Mon, Jul 24, 2023, 10:23 AM Brent Allsop via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:</div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div>So all sufficiently intelligent systems must realize this</div></div></blockquote></div><div dir="auto"><br></div><div dir="auto">It is generally a fallacy to assume that any specific realization will happen on its own, ever.</div><div dir="auto"><br></div><div dir="auto">Direct, clear explanations can get a concept across, but relying on individuals or groups to realize things that have not been pointed out to them has a significant failure rate.</div></div>