[ExI] More thoughts on sentient computers

Dave S snapbag at proton.me
Mon Feb 20 16:44:25 UTC 2023


On Monday, February 20th, 2023 at 11:08 AM, spike jones via extropy-chat <extropy-chat at lists.extropy.org> wrote:

>
> The Turing test indicates only one class of sentience (he said, as we now have software passing the Turing test regularly (demonstrating our collective desperate determination to move the AI goal posts again.))

There's no "the" Turing Test. Even Turing described at least two variations, and none of them was defined precisely enough to serve as a practical test; they likely weren't intended to be used that way. The Wikipedia page <https://en.wikipedia.org/wiki/Turing_test> covers the variations and the problems with implementing them pretty well.

In my opinion, a useful variation would be for an examiner to run two rate-limited text chat sessions at once: one with an AI and one with a human, neither of which can see the dialogue in the other session. The AI would be instructed to pretend to be a human, and the examiner would try to determine which subject is which. There'd have to be other restrictions, because tasks like writing a haiku or a limerick are easy for, e.g., ChatGPT but hard for many humans. A strict enough rate limit could fix that, but it would slow the test down too much.
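
For concreteness, here's a rough sketch of that setup in Python. Everything in it is made up for illustration: the "respond" callables stand in for the human and the AI, and the 30-second minimum gap between messages is an arbitrary rate limit, not a recommendation.

import random
import time

class ChatSession:
    def __init__(self, respond, min_seconds_between_messages=30.0):
        self.respond = respond          # callable: prompt -> reply text
        self.min_gap = min_seconds_between_messages
        self._last = 0.0

    def send(self, prompt):
        # Enforce the rate limit so raw response speed can't give the AI away.
        wait = self.min_gap - (time.time() - self._last)
        if wait > 0:
            time.sleep(wait)
        self._last = time.time()
        return self.respond(prompt)

def run_test(questions, human_respond, ai_respond, judge):
    # Randomly assign the human and the AI to labels A and B. Neither
    # session sees the other's dialogue; the examiner sees both.
    labels = ["A", "B"]
    random.shuffle(labels)
    sessions = {labels[0]: ChatSession(human_respond),
                labels[1]: ChatSession(ai_respond)}
    transcripts = {label: [] for label in sessions}
    for question in questions:
        for label, session in sessions.items():
            transcripts[label].append((question, session.send(question)))
    # judge(transcripts) returns the label the examiner believes is the human.
    return judge(transcripts) == labels[0]

The rate limit is also where the "slows down the test too much" problem shows up: at 30 seconds per reply per session, even a short list of questions stretches the exam out to many minutes.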

-Dave