[ExI] Emily M. Bender — Language Models and Linguistics (video interview)

Ben Zaiboc ben at zaiboc.net
Mon Mar 27 11:30:06 UTC 2023


On 27/03/2023 02:30, Gordon Swobe wrote:
> On Sun, Mar 26, 2023 at 1:38 AM Ben Zaiboc via extropy-chat 
> <extropy-chat at lists.extropy.org> wrote:
>
>     Jason Resch wrote:
>
>>     *3. She was asked what a machine would have to do to convince her
>>     they have understanding. Her example was that if Siri or Alexa
>>     were asked to do something in the real world, like turn on the
>>     lights, and if it does that, then it has understanding (by virtue
>>     of having done something in the real world).*
>
>     Wait a minute. So she thinks that smart home systems have
>     understanding of what they're doing, but LLMs don't? I wonder how
>     many Siris and Alexas are the voice interface for smart home
>     systems? A lot, I expect.
>
>     If she's right (which she's not, seems to be the consensus here),
>     then all that needs to be done is link up a LLM to some smart home
>     hardware, and 'ta-daaaa', instant understanding!
>
>     I don't buy it.
>
>
> She called Alexa-like understanding a "kind of" understanding, as in 
> "sorta kinda," i.e., she brackets the word or puts it in scare-quotes. 
> Insofar as Alexa executes your command to turn off the lights, 
> there is a sense in which it kind of "understands" your command. She 
> is not referring to anything like conscious understanding of the 
> meanings of words.
>
> I also put the word in scare-quotes when I say with a straight face 
> that my pocket calculator "understands" mathematics. I would not say 
> it understands mathematics in the ordinary sense of the word.


Ok. There's no real difference between sending a text message saying 
"turn on the light" and sending a signal to a smart home system to turn 
on the light, and similarly no difference between receiving a message 
saying "ok, lights are on" and getting a positive signal from a 
light-sensor. So we can conclude that both /do/ have a 'kind of' 
understanding, even if they're not conscious.
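To make the equivalence concrete, here is a minimal sketch (all names 
invented for illustration): a natural-language front end and a bare 
protocol front end both funnel into the same action, so the channel 
itself adds nothing to whatever "understanding" is going on.

```python
# Hypothetical sketch: two input channels, one physical effect.
# Names (set_light, text_interface, smart_home_signal) are invented.

def set_light(on: bool) -> str:
    """The single action either channel can trigger."""
    return "ok, lights are on" if on else "ok, lights are off"

def text_interface(message: str) -> str:
    # An Alexa/LLM-style front end: crude natural-language parsing.
    msg = message.lower()
    if "turn on the light" in msg:
        return set_light(True)
    if "turn off the light" in msg:
        return set_light(False)
    return "command not understood"

def smart_home_signal(signal: int) -> str:
    # A bare protocol front end: 1 = on, 0 = off.
    return set_light(signal == 1)

# Both channels end in the same place:
assert text_interface("Please turn on the light") == smart_home_signal(1)
```

If the signal path counts as a 'kind of' understanding, it's hard to see 
why the text path, which bottoms out in the very same function call, 
wouldn't.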

Glad we cleared that up!

Ben