[ExI] Do all AI models represent “cat” in the same way?

Adrian Tymes atymes at gmail.com
Sat Jan 17 14:55:33 UTC 2026


On Sat, Jan 17, 2026 at 6:21 AM John Clark <johnkclark at gmail.com> wrote:
> On Fri, Jan 16, 2026 at 10:02 PM Adrian Tymes via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>> >> Me: And if the direction in multi dimensional idea space for the word "cat" is pointing in a specific direction (relative to other words) that is similar to the direction in multi dimensional idea space that a picture of a cat is pointing to (relative to other pictures) then they must have something in common. And the only thing that could be is that both the pictures and the words came from the same external reality;
>>
>> Incorrect.  There are other possible explanations.
>> For instance, the creators of both sets of training data may have had similar cultural inspirations - they "painted", whether with paint or with words, the same mental image.
>
> In that case I would've been correct because "both the pictures and the words came from the same external reality", and for an AI that only has access to such pictures or words, human interaction IS external reality.

Depends on the definition of "external reality".  I thought you meant
external to everyone involved, including the humans.
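For readers less familiar with the "direction in multidimensional idea space" framing: multimodal models (CLIP-style systems, for example) embed both words and images as vectors, and "pointing in a similar direction" is usually measured as cosine similarity. A minimal sketch, using made-up toy vectors in place of real model embeddings:

```python
import math

def cosine_similarity(u, v):
    """Directional similarity between two embedding vectors:
    1.0 means they point the same way, 0.0 means orthogonal."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical, hand-made vectors standing in for real embeddings.
text_cat  = [0.9, 0.1, 0.3]   # the word "cat"
image_cat = [0.8, 0.2, 0.4]   # a photo of a cat
image_car = [0.1, 0.9, 0.2]   # a photo of a car

print(cosine_similarity(text_cat, image_cat))  # high: similar direction
print(cosine_similarity(text_cat, image_car))  # lower: different direction
```

The argument in the thread is about what it means when the word vector and the image vector end up nearly parallel in a shared space like this.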

>> It could be that the AI is uncovering this mental model, suggesting that its training data does not have much representation from creators with substantially different mental models.
>
> Although some lawyers might object and some companies might deny it, the fact is that virtually all the words on the Internet are part of the training data of every modern AI, and even if we're restricted to just what occurs on this list there sure seems to be a lot of "substantially different mental models".

Nowhere near as different as those who do the objecting claim they are.

The claim is that Chinese or Native American fundamental depictions of
things like "cat" are massively different from Western depictions, so
when AI keeps distilling to Western depictions, that's proof that said
other viewpoints are being excluded or not considered.

That's the claim, anyway.  From what I have seen of the "true"
cultures, many times the underlying concepts aren't actually all that
different, no matter how different they may feel - or, the things
that are claimed as different amount to folklore, akin to saying that
we should give real credence to concepts such as the Fair Folk and
cryptozoology.  In extreme cases this is like saying we should
culturally honor flat earthers and homeopathic remedies.  But many of
the differences come down to different words for the same thing (not
surprising, given that the viewpoints originate in entirely different
languages), or different perspectives on the same concept.  This is
most pronounced in physical stuff that can be perceived; for instance,
whether in Europe, China, pre-European-contact North America, or
anywhere else, certain forms are cat-like and many other forms are not
cat-like.  There is more difference in social stuff, but convergence
on physical phenomena such as "this is a cat" does not imply
convergence on non-physical stuff.
