[ExI] Fwd: puzzle

William Flynn Wallace foozler83 at gmail.com
Thu May 4 18:44:46 UTC 2023


(I decided that you needed what the author wrote next)

from The Big Picture, by Sean Carroll, astrophysicist:

(re an AI trying to pass the Turing test):  'It is easy to fool them, not
only because they don't have the kind of detailed contextual knowledge of
the outside world that any normal person would have, but because they don't
have memories even of the conversation they have been having, much less
ways to integrate such memories into the rest of the discussion.'

'In order to do so, they would have to have inner mental states that
depended on their entire histories in an integrated way, as well as the
ability to conjure up hypothetical future situations, all along
distinguishing the past from the future, themselves from their environment,
and reality from imagination.  As Turing suggested, a program that was
really good enough to convincingly sustain human-level interactions would
have to be actually thinking.'

No memory?    bill w
