[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Stuart LaForge avant at sollegro.com
Sun Mar 26 22:45:40 UTC 2023


Quoting Tara Maya via extropy-chat <extropy-chat at lists.extropy.org>:

>> On Mar 24, 2023, at 1:11 AM, Stuart LaForge via extropy-chat  
>> <extropy-chat at lists.extropy.org> wrote:
>>
>> But really the meaning of words are quite arbitrary and determined  
>> by the people who use them. Thus the referential meanings of words  
>> evolve and change over time and come to refer to different things.
>
>
> The meaning of words can only change over time if the referents that  
> they indicate change.
>
> That does not make words arbitrary, but in fact, shows how important  
> referents are for real language.

I never said referents are not important to meaning, but they are  
relative to time, place, and culture. By arbitrary, I meant variable,  
not trivial. They are important, but they don't need to correspond to  
things that are physical or even real. Surely you can see how an  
LLM can understand fairies and dragons as well as any human does,  
because neither has ever directly experienced a fairy or a dragon,  
your tall tales notwithstanding. ;)

The referents of a sizable number of words are as abstract as the  
words themselves, and thus require no phenomenal experience to  
understand. Think of it as the difference between book-knowledge and  
street-smarts.

Stuart LaForge
