[ExI] Bender's Octopus (re: LLMs like ChatGPT)

BillK pharos at gmail.com
Fri Mar 24 19:37:09 UTC 2023


On Fri, 24 Mar 2023 at 19:17, Jason Resch via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> The difficulty is that "understanding" is adjacent to knowledge and knowledge is adjacent to consciousness. All of these are quite difficult to define, but I will attempt my best:
>
> "Understanding" is knowledge concerning the relations or workings of something.
> "Consciousness" is possession of knowledge.
> "Knowledge" is a true belief.
> "Belief" I have great difficulty defining, but I would say it is a mind state correlated with some proposition.
> "Truth" is undefinable, as proved by Tarski.
>
> I welcome any assistance or corrections to the above.
>
> Jason


Wikipedia has a useful article.
<https://en.wikipedia.org/wiki/Understanding>

It points out that there are levels of understanding.
And an AI can pretend to understand by memorising facts plus a few rules.
Quote:
It is possible for a person, or a piece of "intelligent" software,
that in reality only has a shallow understanding of a topic, to appear
to have a deeper understanding than they actually do, when the right
questions are asked of it. The most obvious way this can happen is by
memorization of correct answers to known questions, but there are
other, more subtle ways that a person or computer can (intentionally
or otherwise) deceive somebody about their level of understanding,
too. This is particularly a risk with artificial intelligence, in
which the ability of a piece of artificial intelligence software to
very quickly try out millions of possibilities (attempted solutions,
theories, etc.) could create a misleading impression of the real depth
of its understanding. Supposed AI software could in fact come up with
impressive answers to questions that were difficult for unaided humans
to answer, without really understanding the concepts at all, simply by
dumbly applying rules very quickly.
-------------------
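To make the "memorisation of correct answers" point concrete, here is a
toy Python sketch (purely illustrative, and nothing like how ChatGPT
actually works internally): a program that looks knowledgeable on
exactly the questions it has stored, and falls flat on any rephrasing.

# Toy illustration only: a "chatbot" that has memorised answers to
# known questions. It appears competent on those questions, but no
# understanding is present anywhere.

memorised_answers = {
    "what is 2 + 2?": "4",
    "what is the square root of 144?": "12",
    "who proved the undefinability of truth?": "Alfred Tarski",
}

def answer(question: str) -> str:
    # Look the question up verbatim; no reasoning happens here.
    return memorised_answers.get(question.strip().lower(), "I don't know.")

print(answer("What is 2 + 2?"))
# -> "4"  (looks like it understands arithmetic)
print(answer("What do you get if you add 2 and 2?"))
# -> "I don't know."  (the same question, rephrased, exposes the trick)

The point of the sketch is that asking "the right questions" -- ones
outside the memorised set -- is what separates apparent understanding
from the real thing.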

BillK

