<div dir="ltr"><div dir="ltr">On Sun, Mar 26, 2023 at 1:38 AM Ben Zaiboc via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div>
Jason Resch wrote:<br>
<br>
<blockquote type="cite"><b>3.
She was asked what a machine would have to do to convince her
they have understanding. Her example was that if Siri or Alexa
were asked to do something in the real world, like turn on the
lights, and if it does that, then it has understanding (by
virtue of having done something in the real world).</b></blockquote>
> Wait a minute. So she thinks that smart home systems have
> understanding of what they're doing, but LLMs don't? I wonder how
> many Siris and Alexas are the voice interface for smart home
> systems? A lot, I expect.
>
> If she's right (which she's not, seems to be the consensus here),
> then all that needs to be done is to link an LLM up to some smart
> home hardware, and 'ta-daaaa', instant understanding!
>
> I don't buy it.

She called Alexa-like understanding a "kind of" understanding, as in "sorta kinda"; that is, she brackets the word or puts it in scare quotes. Insofar as Alexa executes your command to turn off the lights, there is a sense in which it kind of "understands" your command. She is not referring to anything like conscious understanding of the meanings of words.

I also put the word in scare quotes when I say with a straight face that my pocket calculator "understands" mathematics. I would not say it understands mathematics in the conventional, common sense of the word.

-gts
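P.S. To make the point concrete, below is roughly all the "understanding" a voice assistant needs in order to turn the lights off: match the utterance against a pattern, then dispatch a device command. This is only a toy sketch; the function and device names are hypothetical stand-ins, not any real smart-home API.

import re

def set_lights(on: bool) -> str:
    # Stand-in for a real smart-home API call.
    return f"lights {'on' if on else 'off'}"

# Intent table: regex pattern -> action. Alexa-like "understanding"
# in the bracketed sense is just this kind of mapping, scaled up.
INTENTS = [
    (re.compile(r"\bturn (on|off) the lights?\b", re.I),
     lambda m: set_lights(m.group(1).lower() == "on")),
]

def handle(utterance: str) -> str:
    for pattern, action in INTENTS:
        m = pattern.search(utterance)
        if m:
            return action(m)
    return "Sorry, I can't do that."

print(handle("Please turn off the lights"))  # -> lights off

No grasp of meaning appears anywhere in there, which is exactly why the word belongs in scare quotes.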