[ExI] Bender's Octopus (re: LLMs like ChatGPT)

William Flynn Wallace foozler83 at gmail.com
Sat Mar 25 17:02:22 UTC 2023


On Fri, Mar 24, 2023 at 2:16 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Fri, Mar 24, 2023, 3:00 PM William Flynn Wallace via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Would anyone dare to give a definition of 'understanding'?
>> Please cite what epistemologies you are using.  bill w
>>
>
> The difficulty is that "understanding" is adjacent to knowledge and
> knowledge is adjacent to consciousness. All these are quite difficult to
> define, but I will attempt my best:
>
> "Understanding" is knowledge concerning the relations or workings of
> something.  (just why do we just 'standing under' something to represent
> knowledge?)
>
> "Consciousness" is possession of knowledge. (since no creature has a
> blank mind, then all are conscious?)
>
> "Knowledge" is a true belief.  (true according to what epistemology?
> empiricism?  authorities?  intuition? reason?)
>
> "Belief" I have great difficulty defining, but I would say it is a mind
> state correlated with some proposition.  (I would say that it is
> something we think of as knowledge but not based on empiricism but rather
> on faith)
>
> "Truth" is undefinable, as proved by Tarski.  (again, true is something
> we accept according to our personal epistemology)
>
> I don't regard these as corrections, but just my ideas.   bill w
>
> I welcome any assistance or corrections to the above.
>
> Jason
>
>
>
>>
>> On Fri, Mar 24, 2023 at 1:40 PM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>>
>>> On Fri, Mar 24, 2023 at 1:21 PM Gordon Swobe via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>>
>>>>
>>>> On Fri, Mar 24, 2023 at 2:12 AM Stuart LaForge via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>
>>>>
>>>>> But really the meanings of words are quite arbitrary and determined by
>>>>> the people who use them. Thus the referential meanings of words
>>>>> evolve and change over time and come to refer to different things.
>>>>
>>>>
>>>> I agree this is a reason for many human miscommunications, but the
>>>> speaker understands his words to mean *something* and the hearer
>>>> understands those words to mean *something*.
>>>>
>>>> As a computational linguist, Bender is on our side.  She is obviously
>>>> very excited about the progress these language models represent, but is
>>>> reminding us that the models do not actually understand words to mean
>>>> anything whatsoever.
>>>>
>>>>
>>>
>>> What's her evidence of that?
>>>
>>> Jason