[ExI] puzzle

Giovanni Santostasi gsantostasi at gmail.com
Fri May 5 03:48:51 UTC 2023


Yeah,
Nothing written about LLMs before this year has much value, given that they
underwent a phase shift in their performance. I have shown graphs
demonstrating this: a sudden jump in performance on several math and
cognitive tasks, indicating emergent properties.
GPT-4 has a very good memory of the current and even past conversation when
you keep the conversation in the same instance.
Giovanni

On Thu, May 4, 2023 at 5:33 PM Gadersd via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> (re an AI trying to pass the Turing test):  'It is easy to fool them, not
> only because they don't have the kind of detailed contextual knowledge of
> the outside world that any normal person would have, but because they don't
> have memories even of the conversation they have been having, much less
> ways to integrate such memories into the rest of the discussion.'
>
> No memory?    bill w
>
>
> I think Carroll is referring to traditional chat bots that could not
> effectively use information scattered throughout the conversation when
> formulating responses. Simple bots, Markov chain bots for example, base
> the next word generated only on the past few words. Using all available
> information to formulate a response requires a much more sophisticated
> model, such as the much-hyped transformer-based large language models.
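[A toy sketch of the kind of bot described above: a word-level Markov chain whose next word depends only on the last few words, so it has no access to anything said earlier in the conversation. All names and the tiny corpus here are illustrative, not from any real chatbot.]

```python
import random

def build_markov_chain(text, order=2):
    """Map each tuple of `order` consecutive words to the words observed after it."""
    words = text.split()
    chain = {}
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain.setdefault(key, []).append(words[i + order])
    return chain

def generate(chain, start, n_words=10, seed=0):
    """Generate text where each next word depends ONLY on the last `order` words."""
    rng = random.Random(seed)
    out = list(start)
    for _ in range(n_words):
        key = tuple(out[-len(start):])  # the bot's entire "memory"
        candidates = chain.get(key)
        if not candidates:
            break  # dead end: this context was never seen in training
        out.append(rng.choice(candidates))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat saw the dog on the mat"
chain = build_markov_chain(corpus, order=2)
print(generate(chain, ("the", "cat"), n_words=5))
```

[The point of the sketch: the `key` lookup is the whole context window, so information from earlier in the exchange simply cannot influence the next word, which is exactly the limitation Carroll's criticism targets and which transformers relax by attending over the full context.]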
>
> On May 4, 2023, at 2:44 PM, William Flynn Wallace via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>
> (I decided that you needed what the author wrote next)
>
> from The Big Picture, by Sean Carroll, astrophysicist:
>
> (re an AI trying to pass the Turing test):  'It is easy to fool them, not
> only because they don't have the kind of detailed contextual knowledge of
> the outside world that any normal person would have, but because they don't
> have memories even of the conversation they have been having, much less
> ways to integrate such memories into the rest of the discussion.'
>
> In order to do so, they would have to have inner mental states that
> depended on their entire histories in an integrated way, as well as the
> ability to conjure up hypothetical future situations, all along
> distinguishing the past from the future, themselves from their environment,
> and reality from imagination.  As Turing suggested, a program that was
> really good enough to convincingly sustain human-level interactions would
> have to be actually thinking.
>
> No memory?    bill w
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
>
