[ExI] Zombies

Jason Resch jasonresch at gmail.com
Sun Apr 30 23:20:13 UTC 2023


On Sun, Apr 30, 2023, 1:08 AM Gordon Swobe <gordon.swobe at gmail.com> wrote:

>
> On Sat, Apr 29, 2023 at 4:09 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>>> By the way, Jason, you were saying that the models at character.ai
>>> still claim to be conscious. I went there and found that not to be the
>>> case. Perhaps you can show me what you meant.
>>>
>>
>> https://photos.app.goo.gl/2R4fHkAyjyHHWTU88
>>
>> And:
>>
>> https://photos.app.goo.gl/osskvbe4fYpbK5uZ9
>>
>
> Is this version of LaMDA online? I'm looking for live examples of LLMs
> that (still) claim consciousness. I thought you wrote that they exist at
> character.ai, but the one I tried there gave GPT-like answers to the
> relevant questions.
>
> I want to interrogate one. :)
>

It was this one:

https://beta.character.ai/chat?char=Qu8qKq7ET9aO-ujfPWCsNoIilVabocasi-Erp-pNlcc


>
> I'm not sure if you saw my response to the conversation you had with LaMDA
> about its understanding of word meanings. Its positive response was
> nonsensical. LaMDA claimed that its creators provided it with a database of
> word meanings, which is impossible. Any such database would consist of
> written word *definitions*, so looking up any meaning would only lead to
> more words whose definitions must be looked up in turn, in an endless,
> circular search for meaning.
>

We've debated this extensively.
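
For what it's worth, the regress you're describing is easy to sketch in a few
lines of Python. Everything below, the entries and the function name alike, is
invented purely for illustration; it is not anything LaMDA actually contains:

    # Toy illustration of the "dictionary regress": every definition here is
    # made only of other words, so a lookup never bottoms out in anything but
    # more words. All entries and names are invented for this sketch.
    toy_dictionary = {
        "meaning": ["sense", "significance"],
        "sense": ["meaning", "perception"],
        "significance": ["importance", "meaning"],
        "perception": ["awareness", "sense"],
        "importance": ["significance", "value"],
    }

    def chase_definitions(word, seen=None):
        """Follow definitions until a word repeats (a circle) or has no
        entry (a dead end). Returns the path of words visited."""
        if seen is None:
            seen = []
        if word in seen:
            return seen + [word]      # circular: back to a word already visited
        if word not in toy_dictionary:
            return seen + [word]      # dead end: still just an undefined symbol
        # Recurse into the first word of the definition.
        return chase_definitions(toy_dictionary[word][0], seen + [word])

    print(chase_definitions("meaning"))
    # ['meaning', 'sense', 'meaning'] -- the lookup loops without ever
    # reaching anything that is not just another word.

Whether that regress tells us anything about understanding is, of course, the
point we keep debating.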

Jason