[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Gordon Swobe gordon.swobe at gmail.com
Fri Mar 24 07:58:07 UTC 2023


On Fri, Mar 24, 2023 at 1:22 AM Giovanni Santostasi <gsantostasi at gmail.com>
wrote:

> Everything is simulated. Our entire mental life is simulated. The brain
> makes up the world from limited information and creates models all the
> time. We would not be able to understand, or even to see or experience,
> anything if we didn't SIMULATE the world around us. We do it all the time.
> We confabulate all the time, exactly as LLMs do.
> They work because they do EXACTLY what we do.
>

If everything is simulated, why did GPT make the point that it can
only simulate understanding? Because it is true.

But actually, I agree with Jason that we can't trust these models to tell
us the truth about these matters. Their entire purpose and function is to
simulate human speech while appearing not to simulate it.

Also, I am referring to digital simulations, not simulations per se.

-gts