[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Brent Allsop brent.allsop at gmail.com
Mon Mar 27 02:08:27 UTC 2023


On Sun, Mar 26, 2023 at 4:35 PM Gordon Swobe via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Sat, Mar 25, 2023 at 2:42 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Reading these conversations over the last few days, it has struck me
>> that some people keep referring to 'real' things, usually using the word
>> 'referents' (e.g. an apple), as though our brains had direct access to
>> them and could somehow just know what they are.
>>
>> But we don't.
>>
>> Think about it: what is "An Apple"?
>> ...
>> There is no spoon! Er, Apple. There is no Apple!
>> Not as a 'real-world thing'.
>>
>
> It would seem that you would rather say that apples are not real than say
> that the word "apple" has meaning.
>

I don't believe he is saying that at all.  Are YOU saying we don't have
knowledge of an apple, knowledge which has very real redness and greenness
qualities that can be computationally bound?  That knowledge is what we
know about the apple.  Redness is a quality of our knowledge of the apple.
THAT very real physical quality of our knowledge, in the brain, is the
referent of redness.  We don't know the colorness qualities of the apple,
or of anything else out there, since our brains false-color all of our
knowledge so that it can emphasize, in our understanding of the apple,
what is important to us, as part of the computational process of needing
to pick the apple.