[ExI] GPT-4 on its inability to solve the symbol grounding problem

Jason Resch jasonresch at gmail.com
Thu Apr 6 03:48:17 UTC 2023


On Wed, Apr 5, 2023, 11:26 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:

> Frankly I am dumbfounded and flabbergasted that any intelligent person
> would question my statement  "Words mean things. In the absence of those
> things that they mean, they have no meanings."
>


"Words mean things" -- no disagreement here

"In the absence of the things they mean, they have no meaning" -- This I
disagree with. If two English speakers survived while the rest of the
universe disappeared completely, the two speakers could still carry on a
meaningful conversation. Their words would still mean things to them. As
long as there's a brain with an appropriate wiring to process words and
comprehend the network of relations each word has with other words, there
will be meaning. Meaning exists within the mind of the speaker, the
presence or absence of an external universe is irrelevant from the point of
view of the mind (which for all it knows could be dreaming, deluded, or in
a vat or sim).
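
To make the "network of relations" point concrete, here is a minimal
sketch (a toy illustration of my own, assuming a tiny made-up corpus and
simple co-occurrence counts; it is not anything from GPT-4's actual
training): it builds word vectors purely from which words appear next to
which, and compares words by those internal relations alone, with no
reference to anything outside the text.

    # Toy illustration: word "meaning" from relations between words alone.
    # Assumptions: a tiny hand-made corpus and a one-word co-occurrence
    # window; real language models learn far richer relational structure.
    from collections import Counter, defaultdict
    from math import sqrt

    corpus = [
        "the cat chased the mouse",
        "the dog chased the cat",
        "the mouse ate the cheese",
        "the cat ate the mouse",
    ]

    # Count how often each word appears next to each other word.
    cooc = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for i, w in enumerate(words):
            for j in range(max(0, i - 1), min(len(words), i + 2)):
                if i != j:
                    cooc[w][words[j]] += 1

    def similarity(a, b):
        """Cosine similarity between two words' co-occurrence vectors."""
        va, vb = cooc[a], cooc[b]
        dot = sum(va[w] * vb[w] for w in va)
        na = sqrt(sum(c * c for c in va.values()))
        nb = sqrt(sum(c * c for c in vb.values()))
        return dot / (na * nb) if na and nb else 0.0

    # "cat" comes out closer to "mouse" than to "cheese", using nothing
    # but relations internal to the text.
    print(similarity("cat", "mouse"), similarity("cat", "cheese"))

Nothing in this sketch points at an external referent; the relational
structure of the text alone is enough to rank how similar the words are
to one another.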


Jason


> How do you all think you communicate here on ExI or IRL? You use words
> that mean things to you and which you expect will mean very similar things
> to others. The word-symbols that you write or utter are merely the vehicles
> for the meanings. Words without meanings are no more than, well,
> meaningless nonsense.
>
> -gts
>
>