<div dir="auto"><div class="gmail_quote" dir="auto"><div dir="ltr" class="gmail_attr">On Wed, Dec 11, 2024, 6:29 AM Adrian Tymes via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div dir="ltr"> They just do what they do.) Since only partnering with a human achieves this level of execution, the only AIs that are able to self-improve substantially are ones that essentially incorporate a human, who gets improved as part of the process. The Singularity, or something like it, thus happens with upgraded human minds, so the resulting superintelligences see themselves as human and care about humanity.</div><div class="gmail_quote"><div><br></div><div>Will those do for a start?</div></div></div></blockquote></div><div dir="auto"><br></div><div dir="auto">Absolutely yes.</div><div dir="auto"><br></div><div dir="auto">I especially like the idea of the AI cooperating with human desires for mutual benefit.</div><div dir="auto"><br></div><div dir="auto">It will be interesting to reflect on the feelings of isolation vs. connection once we are on the other side. I know humanity was depicted as fighting against the Borg in the Star Trek universe. I found it ironic that they/we only succeeded against the threat of subsumption into the Borg collective by embracing the ideology of a Humanity collective.</div><div dir="auto"><br></div><div dir="auto">Thanks!</div><div class="gmail_quote" dir="auto"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
</blockquote></div></div>