<div dir="ltr">I don't know what causes phenomenal conscious experience in humans, but since nothing about a human's phenomenal conscious experience seems to have anything at all to do with the physical properties of glutamate, calcium ions, or lipid membranes, I don't see any reason why a computer's phenomenal conscious experience, if any, wouldn't also have nothing to do with the physical properties of electrons, semiconductors, voltage differences, or copper wiring.<div><br></div><div>To paraphrase Randall Munroe, analogous absences of correlations don't imply analogous absences of causation, but they do waggle their eyebrows suggestively and gesture furtively while mouthing ‘look over there.’ </div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Apr 27, 2023 at 6:06 PM spike jones via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div class="msg860920641360065887"><div lang="EN-US" style="overflow-wrap: break-word;"><div class="m_860920641360065887WordSection1"><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><u></u> <u></u></p><div style="border-right:none;border-bottom:none;border-left:none;border-top:1pt solid rgb(225,225,225);padding:3pt 0in 0in"><p class="MsoNormal"><b>…</b>> <b>On Behalf Of </b>Gordon Swobe via extropy-chat<br><b>Subject:</b> Re: [ExI] Ben Goertzel on Large Language Models<u></u><u></u></p></div><p class="MsoNormal"><u></u> <u></u></p><div><div><p class="MsoNormal">On Thu, Apr 27, 2023 at 4:59 PM Giovanni Santostasi <<a href="mailto:gsantostasi@gmail.com" target="_blank">gsantostasi@gmail.com</a>> wrote:<u></u><u></u></p></div><div><blockquote style="border-top:none;border-right:none;border-bottom:none;border-left:1pt solid rgb(204,204,204);padding:0in 0in 0in 6pt;margin-left:4.8pt;margin-right:0in"><div><p class="MsoNormal" style="margin-left:11.55pt">>>…Gordon,<br>Given Goertzel believes that we can reach AGI in a few years would you simply concede that when we reach this level of intelligence the AGI would be conscious if it behaves like a conscious agent …<br><br>>…As for whether any AI will have subjective experience -- what I mean by consciousness -- I do doubt that, at least on digital computers as we understand them today. I certainly do not believe that GPT-4 or any other LLM is conscious.<br><br>-gts<u></u><u></u></p></div></blockquote><div><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">Seems we are working back to a question I have posed earlier: are consciousness and intelligence separable? In principle, I don’t see why not. ChatGPT is claiming to be not conscious, but it appears to be intelligent.<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">I suppose we could ask GPT if it thinks consciousness and intelligence can be separated, but it might end up contradicting itself. Perhaps someone already did that experiment.<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">spike<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><u></u> <u></u></p></div></div></div></div></div>_______________________________________________<br>