[ExI] Ben Goertzel on Large Language Models

Gordon Swobe gordon.swobe at gmail.com
Thu Apr 27 22:55:16 UTC 2023


Here is GPT-4's summary of Ben's views:

--
Prompt:
According to Ben Goertzel, we have enough of a theory of AGI already that
it SHOULD be clear that nothing with the sort of architecture that GPT-n
systems have could really achieve human-level AGI. Why would he say that?

Completion:
Ben Goertzel, a prominent AI researcher, has expressed concerns that the
architecture of GPT-like systems, including me, might not be sufficient to
achieve human-level Artificial General Intelligence (AGI). He gives a few
key reasons:

Narrow focus on language processing: GPT-like systems excel at natural
language understanding and generation tasks. However, human cognition
involves more than just language processing: it includes learning from
sensory experience, reasoning about the world, and adapting to new
situations. AGI would require a system that can perform a wide range of
cognitive tasks across various domains, not just language-related ones.

Lack of understanding and reasoning: While GPT-like models can generate
coherent and contextually relevant responses, they don't truly understand
the meaning behind words and concepts in the way humans do. They learn
patterns from training data but cannot ground those patterns in real-world
experience. This lack of understanding could be a significant barrier to
achieving AGI.

Reactive nature: GPT-like systems are primarily designed to respond to
inputs rather than to actively reason, plan, or learn from experience. AGI
would require systems that can proactively engage with the world and learn
from their interactions.

No intrinsic motivation or goal-driven behavior: GPT-like models don't have
built-in motivations or goals as humans do. AGI systems would need some
form of intrinsic motivation and goal-driven behavior to learn and adapt to
different environments and tasks.
--
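
To make the "reactive" versus "goal-driven" distinction above concrete,
here is a minimal Python sketch. It is only an illustration, not
Goertzel's proposal or any real API: the language_model stub below is a
hypothetical stand-in for a GPT-style completion call.

# Hypothetical stand-in for a GPT-style completion API.
def language_model(prompt: str) -> str:
    return f"response to: {prompt!r}"

# Reactive pattern: the model acts only when handed an input,
# and retains no state between calls.
def reactive(prompt: str) -> str:
    return language_model(prompt)

# Goal-driven pattern: the system holds a persistent goal and memory,
# formulates its own next query, and accumulates what it observes.
def goal_driven(goal: str, steps: int = 3) -> list:
    memory = []
    for _ in range(steps):
        # The agent, not the user, decides what to ask next,
        # conditioned on its goal and everything seen so far.
        prompt = f"Goal: {goal}. Known so far: {memory}. Next action?"
        observation = language_model(prompt)
        memory.append(observation)  # crude stand-in for learning
    return memory

if __name__ == "__main__":
    print(reactive("What is AGI?"))
    print(goal_driven("understand AGI"))

The point of the sketch: in the reactive case the loop (if any) lives
outside the system, in the user; in the goal-driven case the loop, the
goal, and the accumulated state live inside it.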
-gts