[ExI] People often think their chatbot is alive

Ben Zaiboc ben at zaiboc.net
Fri Jul 15 20:21:47 UTC 2022


On 15/07/2022 20:02, DaveSill wrote:
> On Thu, Jul 14, 2022 at 5:28 AM Ben Zaiboc via extropy-chat 
> <extropy-chat at lists.extropy.org> wrote:
>
>     [...] They spontaneously (and endlessly!) ask questions.
>     They want to know "but why??" about every sodding thing. I remember
>     being like this myself, and driving my father up the wall. I
>     reckon that
>     when a machine starts exhibiting the same behaviour, we can be pretty
>     sure it is conscious, or at least well on the way.
>
>
> But an AI wouldn't have to ask a human to answer most of its 
> questions--it can research them and answer them by analyzing what it 
> already "knows" or what's available to it remotely. Asking questions 
> of elders is one way we learn but it's not the only way.
>
> -Dave

Ok, but why is that a 'But'? I'd think it would be an 'And'.

Or are you thinking we wouldn't know, or be able to tell, that that's 
what it was doing, and so we wouldn't be able to use it as an indication?

Ben