[ExI] Language models are like mirrors

Jason Resch jasonresch at gmail.com
Sat Apr 1 21:56:35 UTC 2023

On Sat, Apr 1, 2023, 2:11 PM Gordon Swobe via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Sat, Apr 1, 2023 at 7:36 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> On 01/04/2023 13:43, Gordon Swobe wrote:
>> Unlike these virtual LLMs, we have access also to the referents in the
>> world that give the words in language meaning.
>> I don't understand why this argument keeps recurring, despite having been
>> demolished more than once.
> It has not been demolished, in my opinion, and incidentally, as I’ve
> mentioned, my view is shared by the faculty director of the masters program
> in computational linguistics at the University of Washington. This is what
> she and her fellow professors teach. Many others understand things the same
> way. Brent points out that the majority of those who participate in his
> canonizer share similar views, including many experts in the field.
> I fail to see any significant difference between my brain and an LLM,
> On Exi, the computational model of mind is almost taken for granted.
> Consciously or unconsciously, almost everyone here believes their brain is,
> in essence, a digital computer.

It's not without some justification. Either the brain's behavior is
computable or it is not. And zombies are either possible or they are not.
If the brain's behavior is computable and zombies are impossible (and there is
strong evidence supporting both conclusions), then you arrive at the
computational theory of mind.

> But this is only one of many models of mind, and one that I reject.

Is there one that you accept?

