<div dir="auto"><div><br><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Mar 23, 2023, 6:21 PM spike jones via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div lang="EN-US" link="blue" vlink="purple" style="word-wrap:break-word"><div class="m_-1310333575552725565WordSection1"><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal"><u></u> <u></u></p><div style="border:none;border-top:solid #e1e1e1 1.0pt;padding:3.0pt 0in 0in 0in"><p class="MsoNormal"><b>From:</b> extropy-chat <<a href="mailto:extropy-chat-bounces@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat-bounces@lists.extropy.org</a>> <b>On Behalf Of </b>Adrian Tymes via extropy-chat<br><b>Subject:</b> Re: [ExI] Bender's Octopus (re: LLMs like ChatGPT)<u></u><u></u></p></div><p class="MsoNormal"><u></u> <u></u></p><div><div><div><div><p class="MsoNormal">On Thu, Mar 23, 2023, 12:56 PM Stuart LaForge via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank" rel="noreferrer">extropy-chat@lists.extropy.org</a>> wrote:<u></u><u></u></p></div><blockquote style="border:none;border-left:solid #cccccc 1.0pt;padding:0in 0in 0in 6.0pt;margin-left:4.8pt;margin-right:0in"><p class="MsoNormal">I posed this exact question to ChatGPT <u></u><u></u></p></blockquote></div></div><div><p class="MsoNormal"><u></u> <u></u></p></div><div><p class="MsoNormal">>…ChatGPT has references for what bears and sticks are…<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">Ja, there was something kinda cool about the exchange. ChatGPT was told “…I am being attacked by an angry bear…”<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">It somehow understood that the interlocutor was not at that moment in the process of being devoured while pecking away on his computer for advice on a survival strategy (the subject of my silly riff.) It understood it was being asked about a theoretical situation rather than what it was literally told.<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">That kinda implies a form of understanding, or specifically: a very effective use of language models to create the illusion of understanding.<u></u><u></u></p><p class="MsoNormal"><u></u> <u></u></p><p class="MsoNormal">I really don’t think it thinks, but it makes us think it thinks. ChatGPT is wicked cool.</p></div></div></div></div></blockquote></div></div><div dir="auto"><br></div><div dir="auto">Is there such a thing as "simulated multiplication" or would we say simulated multiplication is the same thing as real multiplication?</div><div dir="auto"><br></div><div dir="auto">Is there such a thing as "simulated thinking"?</div><div dir="auto"><br></div><div dir="auto">Jason</div><div dir="auto"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div lang="EN-US" link="blue" vlink="purple" style="word-wrap:break-word"><div class="m_-1310333575552725565WordSection1"><div><div><p class="MsoNormal"><u></u></p><p class="MsoNormal"><u></u></p></div></div></div></div>
Jason