[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Brent Allsop brent.allsop at gmail.com
Mon Mar 27 02:10:55 UTC 2023


Hi William,
I'm not sure what you are asking with that "Huh?"
Perhaps you can tell me what you think a quality is.  That should help me
understand what you are asking.
Would you agree that a physical quality (whatever you think it is) is the
referent of the word 'redness'?
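
As a loose sketch of the "abstract system" point below (the toy
definitions table is invented for illustration, nothing from this
thread): in a symbols-only system, chasing a word through its
definitions only ever turns up more words, never a quality.

    # Toy symbols-only dictionary: every entry points at more symbols.
    definitions = {
        "red": ["the", "colour", "of", "blood", "or", "a", "ripe", "tomato"],
        "colour": ["a", "property", "of", "light", "seen", "by", "the", "eye"],
    }

    def chase(word, depth=2):
        # Follow a word's definition; each step yields only more words.
        if depth == 0 or word not in definitions:
            return word
        return [chase(w, depth - 1) for w in definitions[word]]

    print(chase("red"))  # nested lists of words, never a redness experience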



On Sun, Mar 26, 2023 at 3:29 PM William Flynn Wallace via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> But a picture dictionary doesn't work for an abstract system, since all
> it has for its knowledge of the picture is yet another abstract word
> like 'redness'.  brent
>
> Huh?  Can't use an example of redness?  ???  Also, to increase stimulus
> generalization, you would include several photo examples.  Many words which
> have an abstract use also have concrete examples, such as 'floor'.  I can
> see it will have trouble with honesty or love.   bill w
>
> On Sun, Mar 26, 2023 at 3:02 PM Brent Allsop via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>> Your referent for the word redness is the subjective quality your brain
>> uses to represent red knowledge.
>> So, a picture of red in a dictionary works for you, as your brain
>> produces a redness experience when you look at it.
>>
>> But a picture dictionary doesn't work for an abstract system, since all
>> it has for its knowledge of the picture is yet another abstract word
>> like 'redness'.
>>
>> On Sat, Mar 25, 2023, 10:57 AM William Flynn Wallace via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> I won't argue what a referent means.
>>>
>>> I agree.  It is just what John would say, but in different words:  he
>>> would emphasize, as I do, that for definitions you need examples, and that
>>> is why I, tongue not totally in cheek, wrote that you should give an AI a
>>> picture dictionary.   bill w
>>>
>>> On Sat, Mar 25, 2023 at 3:41 AM Ben Zaiboc via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> Reading these conversations over the last few days, it has struck me
>>>> that some people keep referring to 'real' things, usually using the
>>>> word
>>>> 'referents' (e.g. an apple), as though our brains had direct access to
>>>> them and could somehow just know what they are.
>>>>
>>>> But we don't.
>>>>
>>>> Think about it, what is "An Apple"?
>>>>
>>>> It's a term that we associate with a large set of sensory and memory
>>>> data, including language data, but mostly visual, textural, taste,
>>>> smell, and emotional data stored as memories.
>>>>
>>>> Seeing as we all have different memories associated with the label "An
>>>> Apple" (because some of us were sick the first time we ate one, some of
>>>> us are allergic to something in apples, some of us have a greater
>>>> impression of sweetness, or sourness, when we eat one, some of us once
>>>> discovered a maggot in one, some people have only ever eaten Granny
>>>> Smiths, others only Braeburns, or Crab Apples, and so on and so on...),
>>>> then 'An Apple' is a different thing to each of us.
>>>>
>>>> There is no spoon! Er, Apple. There is no Apple!
>>>> Not as a 'real-world thing'.
>>>>
>>>> "An Apple" is an abstract concept that, despite the individual
>>>> differences, most of us can agree on, because there are a lot of common
>>>> features for each of us, such as general shape, some common colours, a
>>>> set of smells and tastes, how we can use them, where we get them from,
>>>> and so on.. The concept is represented internally, and communicated
>>>> externally (to other people) by a linguistic label, that refers, for
>>>> each of us, to this large bunch of data extracted from our senses and
>>>> memories: "Una Manzana".
>>>>
>>>> It's all 'nothing but' Data. Yet we all think that we 'understand' what
>>>> an Apple is, based purely on this data in our brains (because we have
>>>> access to nothing else).
>>>>
>>>> So this idea of a label having 'a referent' seems false to me. Labels
>>>> (data in our heads) refer to a big set of data (in our heads). Where
>>>> the
>>>> data comes from is secondary, diverse, and quite distant, when you
>>>> trace
>>>> the neural pathways back to a large and disparate set of incoming
>>>> sensory signals, scattered over space and time. The meaning is created
>>>> in our minds, not resident in a single object in the outside world.
>>>>
>>>> This is my understanding of things, anyway.
>>>>
>>>> Ben
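
A minimal sketch of Ben's picture, assuming only what he says above (the
feature sets are invented for illustration): each person's label keys
into a private store of data, and the word still works between people
because those stores overlap on common features.

    # Two people's "apple" data differ, yet the shared label works
    # because the sets intersect on common features.
    alice_apple = {"round", "red", "sweet", "crunchy", "grows on trees"}
    bob_apple = {"round", "green", "sour", "crunchy", "found a maggot once"}

    shared = alice_apple & bob_apple    # the agreed-on core, e.g. {'round', 'crunchy'}
    private = alice_apple ^ bob_apple   # everything that stays personal

    print(shared)
    print(private)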