[ExI] Language models are like mirrors
Gordon Swobe
gordon.swobe at gmail.com
Sun Apr 2 04:41:33 UTC 2023
I wrote:

> They are trained on massive amounts of written material, much of it
> written by conscious humans in conversation, and so they mimic conscious
> humans in conversation.
More accurately, they are trained on massive amounts of text, much of it
written in the first person. This includes both fictional and nonfictional
material. Is it so surprising, then, that they can write persuasively in
the first person and appear conscious? But they are conscious only in the
same sense that a fictional character in a novel written in the first
person is conscious.
-gts