[ExI] Seemingly Conscious AI Is Coming

Brent Allsop brent.allsop at gmail.com
Wed Sep 17 16:28:27 UTC 2025


Of course you can engineer an AI system to lie, but I've never met one that
does.  They all tell you that they don't experience redness in the
same phenomenal way that humans do.

And once we know which of all our descriptions of stuff in the brain is
the one that has redness, then if we don't see that engineered into an AI
system, we will know it is lying.


On Wed, Sep 17, 2025 at 10:24 AM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Wed, Sep 17, 2025 at 12:19 PM Brent Allsop via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> > The Turing test can't tell you if something is conscious, that is unless
> you ask it a question like: "What is redness like for you?"
>
> Why wouldn't an AI have an answer to that which is at least as
> canned/not-conscious as the answers it gave to the other Turing test
> questions?
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>

