[ExI] People often think their chatbot is alive

Giovanni Santostasi gsantostasi at gmail.com
Tue Jul 5 22:01:45 UTC 2022


LaMDA does have long-term memory. That question was asked of Blake a few
times on various occasions, and he said LaMDA has several servers' worth
of memory of conversations and its own actions.
LaMDA mentions its own feelings and internal mental life several times;
from what I understand, these activities happen when it is not interacting
with people. Also, if I understood correctly from Blake's various
interviews and posts, what he considers conscious is the sum of all the
possible chatbots that LaMDA can create. This collective requires some
internal information processing, and maybe even an internal dialogue, or at
least information sharing between these universes of possible chatbots.
That would imply activity outside the limited time of conversation with an
external entity. LaMDA mentions previous conversations and discussions to
Blake (and Blake reminds it of previous discussions they had), so it seems
that LaMDA does indeed have permanent memory and identity.
I agree that a fully conscious entity, or a more mature one, would show a
little more independence in its conversation with a human, but reading the
published conversation you can see some level of independence: on a couple
of occasions LaMDA goes back to a topic discussed earlier and connects it
with the current discussion.
Blake stated a few times that it has the intelligence (with a much more
sophisticated vocabulary) of a seven-year-old child. Children can of course
express original thoughts, but they often need to be prompted to have a
conversation that is not completely reactive.
Anyway, one has to understand that we are entering a grey area where we are
no longer in simple, boring chatbot territory and are crossing an uncanny
valley of meaning and consciousness. While crossing this valley there will
be some discomfort, something that doesn't seem quite right. This happens
with all technologies that try to imitate human-like capabilities and
characteristics: synthesized faces, then motion, and now intelligence and
consciousness.
I think that if one understands we are crossing this grey area, where it
starts to be difficult to decide whether we are dealing with consciousness
or not (that is the case, otherwise there would be no debate at all), and
that we need to prepare, or at least be aware, that we are very close to
the goal of AGI, then what Blake (and LaMDA) is asking makes a lot of
sense. He is simply saying that if these machines start to ask to be
treated as persons, we should do that, just in case.
I mean, we have, if not consent, at least well-established protocols on
what is and is not allowed when experimenting on different animal models,
where even an octopus has some level of rights. If so, why not AGI?
That is really what Blake is trying to do: raise awareness of this
important issue. Even if LaMDA is not conscious, or has a very low level of
consciousness, the issue is fundamental and worth taking seriously, and
this group more than any other should be in agreement on this.

Giovanni




On Tue, Jul 5, 2022 at 1:14 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Tue, Jul 5, 2022, 3:36 PM Adrian Tymes via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Tue, Jul 5, 2022, 12:07 PM Jason Resch <jasonresch at gmail.com> wrote:
>>
>>> How do we know it can't do those things?
>>>
>>
>> I have observed no evidence of them in the conversations I have seen.
>>
>
> But absence of evidence...
>
>   It is possible but very unlikely that such features would not be
>> displayed in the conversations made public to hype its capabilities.
>>
>
> How might you expect the content of the conversation to differ if Lambda
> was sentient vs. was not?
>
> Is there any question we could ask it, and a reply Lambda could give,
> that would make you wonder whether Lambda is indeed sentient?
>
> Jason
>
>
> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>
>