[ExI] People often think their chatbot is alive

Jason Resch jasonresch at gmail.com
Tue Jul 5 19:07:47 UTC 2022


On Tue, Jul 5, 2022, 2:51 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Tue, Jul 5, 2022 at 11:33 AM Jason Resch <jasonresch at gmail.com> wrote:
>
>> The critique that "it's purely reactive" could equally be leveled against
>> our brains, and their purely reactive biochemical processes, and purely
>> reactive neurons.
>>
>
> Not so much.  Our brains are capable of doing things without immediate
> external prompting.
>
> LaMDA would be a step closer - not all the way, but closer - if it had a
> timer that let it do something at times other than the moment it receives
> incoming text.  Another step closer would be if one instance could
> communicate or act outside that specific chat session - for example, share
> a chat with others, remember a chat after closing the session and starting
> another one with the same or a different person, or, perhaps better yet,
> send email or otherwise do something that doesn't get erased at the end of
> the session.
>
>

How do we know it can't do those things? The Google engineer said LaMDA
"reads Twitter". So it could be on some kind of loop.


There's also the concept of "unfelt time gaps" (which I think is a
universal limitation of conscious entities).

If LaMDA is conscious intermittently, it would nevertheless feel
continuously aware, because it isn't aware of the points in time at which
it is not aware.

I address the "long-term memory requirement" in that thread. There have
been humans who lacked the capacity to form long-term memories, yet no one
doubts they were still conscious despite that deficit.

Jason