If a flower can attract a male insect by presenting the facsimile of a female insect, it shows that both the flower and the insect have evolved to do what they do. The flower, like the insect, has a certain level of "intelligence," but it is not an intelligence anything like the insect's, because the reward system it evolved within is nothing like that of an actual female insect.

The fact that we have created a facsimile of human intelligence in no way makes it anything like human intelligence. It could be some other kind of intelligence.

Tara Maya

> On Mar 18, 2023, at 3:29 PM, Darin Sunley via extropy-chat <extropy-chat@lists.extropy.org> wrote:
>
> If a system can formulate a sentence that employs a word,
> and I can then ask the system to tell me the definition of that word,
> and it does,
> and I can then ask the system to relate its definition to the original utterance,
> and it does,
> and the system goes on to continue to use that word,
> and all of the system's uses of that word are consistent with the definition,
> then I no longer even understand what it could possibly mean for that system to /not/ "really understand" that word.
>
> You might just as well say humans don't "really understand" words, since all our neurons are doing is manipulating concentrations of neurotransmitters and calcium ions.
>
>> On Sat, Mar 18, 2023 at 10:15 AM William Flynn Wallace via extropy-chat <extropy-chat@lists.extropy.org> wrote:
>>
>> Is there any referent for the word 'understand' as it relates to an AI? bill w
>>
>>> On Sat, Mar 18, 2023 at 4:42 AM Gordon Swobe via extropy-chat <extropy-chat@lists.extropy.org> wrote:
>>>
>>> I think those who believe LLM AIs like ChatGPT are becoming conscious or sentient like humans are missing a very important point: these software applications only predict language. They are very good at predicting which word should come next in a sentence or question, but they have no idea what the words mean. They do not and cannot understand what the words refer to. In linguistic terms, they lack referents.
>>>
>>> Maybe you all already understand this, or maybe you have some reasons why I am wrong.
>>>
>>> -gts
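For concreteness, the "predicting which word should come next" that Gordon describes can be sketched with a toy model. The Python below is a minimal bigram predictor over an invented ten-word corpus; it is nothing like a real LLM, which uses a transformer network trained on vast text, but the input/output contract is the same: given context, return a probability distribution over possible next words.

from collections import Counter, defaultdict

# Invented toy corpus; a real LLM trains on billions of documents.
corpus = "the cat sat on the mat the cat ate the rat".split()

# Count which word follows which. This is a bigram model; real LLMs
# condition on far longer contexts, but the task is the same.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Distribution over the next word, given one word of context."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(predict_next("the"))  # {'cat': 0.5, 'mat': 0.25, 'rat': 0.25}
print(predict_next("cat"))  # {'sat': 0.5, 'ate': 0.5}

Note that nothing in this sketch ties the token "cat" to any actual cat; it traffics only in co-occurrence statistics. Whether the same remains true of LLMs at scale, or whether a consistency test like Darin's shows otherwise, is the point under dispute above.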