[ExI] AI Hallucinations

Darin Sunley dsunley at gmail.com
Mon Apr 29 14:28:12 UTC 2024


The very first time I tried to use ChatGPT, it was to find a book from my
childhood whose title, "Comparisons," made it very difficult to Google.

The AI gave me highly specific answers about the book, its authors, and
publishers that were plausible, confident, and completely made up.

On Mon, Apr 29, 2024, 8:16 AM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Sat, 27 Apr 2024 at 16:49, Kelly Anderson via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> >
> > In our conversations regarding Tikopia, and Jared Diamond in general,
> > it seems that we are running into a fairly large number of AI
> > hallucinations. Bill said "llama-3 quotes book and page number
> > references, so it may be more reliable." This particular form of
> > hallucination is, unfortunately, at this time extremely common for
> > LLMs. I have often found such references pointing to articles, books,
> > and scientific journals that DON'T EVEN EXIST. When an LLM says
> > something, it wants to sound authoritative, just as Jared Diamond
> > wanted to sound authoritative when writing his books. What is
> > happening in both cases is similar to the six-year-old know-it-all
> > who will make things up to win an argument. Personally, I went
> > through a period earlier in life when I repeated things I'd heard as
> > if they were true. While not intentionally lying, I gave credence to
> > things I shouldn't have without further research. I believe that AIs
> > are at this stage of development. So I would caution all of us to
> > verify that materials cited by the know-it-all AIs of our day
> > actually exist. I am encouraged by work going on in the AI community
> > to double-check such statements coming out of LLMs, and I'm quite
> > certain this is a short-term problem. Nevertheless, it remains a big
> > concern for the current AI models.
> >
> > This leads me to another semi-random thought... We've always tried
> > to maintain a history of what we've done as humans over the
> > millennia. We have backups, but they become increasingly difficult
> > to retrieve over time. This seems especially difficult with today's
> > generative AIs... I don't think future researchers will be able to
> > interact with our LLMs very easily, since they are cloud-based and
> > expensive to maintain just for futuristic anthropologists and
> > computer scientists. While I don't think this is an existential
> > threat or anything, it is interesting, to me at least. LOL
> >
> > -Kelly
> > _______________________________________________
>
>
> Yes, I have read of a case where lawyers who used LLMs in preparation
> ended up citing fictitious cases and decisions to the judge.
>
> I recommend using a search engine like DuckDuckGo to attempt to find
> evidence to support claims made by an LLM.
>
> Re the Tikopia discussion, I found some support for the LLM's claims.
> For example -
> <https://ethicsofsuicide.lib.utah.edu/tradition/indigenous-cultures/oceanic-cultures/solomon-islands4/>
> SOLOMON ISLANDS
> #4 Tikopian Attitudes Towards Suicide
>      (Raymond Firth, 1967)
>
> and
> <https://www.researchgate.net/publication/350101339_%27The_Natives_Freely_Spoke_of_the_Custom%27_Sex-Selective_Infanticide_and_Maori_Depopulation_1815-58>
>
> ‘The Natives Freely Spoke of the Custom’: Sex-Selective Infanticide
> and Māori Depopulation, 1815–58
> March 2021      The Journal of Pacific History 56(1):1-24
> DOI:10.1080/00223344.2021.1882838
> Authors:   Simon Chapple.
> -------------------
>
> BillK
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
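BillK's advice above, checking an LLM's citations against outside evidence
before trusting them, can be partly automated when a citation carries a DOI,
as the Chapple article does. Here is a minimal Python sketch. It assumes the
public doi.org resolver; the syntactic check alone cannot catch a fabricated
but well-formed DOI, so the resolver lookup is what actually matters.

```python
# Sketch: sanity-check a DOI from an LLM-supplied citation.
# A hallucinated reference often fails one of these two checks.
import re
import urllib.request

# DOIs start with a "10." prefix, a 4-9 digit registrant code, and a suffix.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")


def looks_like_doi(doi: str) -> bool:
    """Syntactic check only; a well-formed DOI can still be made up."""
    return bool(DOI_PATTERN.match(doi))


def doi_resolves(doi: str, timeout: float = 10.0) -> bool:
    """Ask the doi.org resolver whether the DOI actually exists.

    Returns False on HTTP errors (e.g. a 404 for a fabricated DOI) or
    network failure, so treat False as "unverified", not "fake".
    """
    req = urllib.request.Request(f"https://doi.org/{doi}", method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout):
            return True
    except Exception:
        return False


if __name__ == "__main__":
    # The DOI BillK quotes for the Chapple article.
    doi = "10.1080/00223344.2021.1882838"
    print(looks_like_doi(doi))
```

For references without a DOI, the fallback is exactly what BillK suggests:
paste the title into a search engine and confirm the work exists.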

