[ExI] AI Hallucinations

Kelly Anderson postmowoods at gmail.com
Sat Apr 27 15:45:46 UTC 2024


In our conversations regarding Tikopia, and Jared Diamond in general,
it seems that we are running into a fairly large number of AI
hallucinations. Bill said "llama-3 quotes book and page number
references, so it may be more reliable." Unfortunately, this
particular form of hallucination is extremely common in today's LLMs.
In many cases, I have found such references to point to articles,
books, and scientific journals that DON'T EVEN EXIST. When an LLM says
something, it wants to sound authoritative, just as Jared Diamond
wanted to sound authoritative when writing his books. What is
happening in both cases is similar to the six-year-old know-it-all who
will make up stuff to win an argument.

Personally, I went through a period earlier in life when I repeated
things I'd heard as if they were true. While not intentionally lying,
I gave credence to things I shouldn't have without doing further
research. I believe that AIs are at this stage of development. So I
would caution all of us to make sure that materials referenced by the
know-it-all AIs of our day actually exist. I am encouraged by work
going on in the AI community to double-check such statements coming
out of LLMs, and I'm quite certain that this is a short-term problem.
Nevertheless, it's a big concern for the current AI models.

This leads me to another semi-random thought... We've always tried to
maintain a history of what we've done as humans over the millennia. We
have backups, but they become increasingly difficult to retrieve over
time. This seems like a particularly difficult thing to do with
today's generative AIs... I don't think future researchers will be
able to interact with our LLMs very easily, since they are cloud-based
and expensive to maintain just for future anthropologists and
computer scientists. While I don't think this is an existential threat
or anything, it is interesting, to me at least. LOL

-Kelly

