[ExI] all we are is just llms

Jason Resch jasonresch at gmail.com
Sun Apr 23 09:39:41 UTC 2023

On Sun, Apr 23, 2023, 2:05 AM spike jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> *From:* extropy-chat <extropy-chat-bounces at lists.extropy.org> *On Behalf
> Of *Jason Resch via extropy-chat
> *…*
> >…Imagine we took GPT-4 back to 1980 or 1960. Is there any doubt people of
> that time (including AI researchers) would consider GPT-4 an AGI?  Jason
> Jason that question demonstrates how far we have been able to move the AI
> goalposts since those benighted days.  Had you showed us GPT-4 in 1980, we
> would have unanimously called it artificial intelligence at or above human
> level.  I was there, I remember.  We have moved the goalposts so far, we
> couldn’t even see the original goalpost site given the Hubble space
> telescope.

Great perspective, spike. For the fun of it, I decided to compare past versions of the Wikipedia article on AGI. This is how it was defined in 2009:


"- reason, use strategy, solve puzzles, and make judgements under uncertainty;
- represent knowledge, including commonsense knowledge;
- plan;
- machine learning;
- communicate in natural language;
- and the ability to integrate all these skills towards common goals."

GPT-4 obviously meets all of these. We could argue about learning, but it's
clear that within a session of interaction GPT-4 learns and can be
instructed. It also learned everything it knows from reading, which is
another example of machine learning. But compare that 2009 definition to the current one:


"An artificial general intelligence (AGI) is a hypothetical intelligent
agent which can learn to replicate any intellectual task that human beings
or other animals can.[1][2] AGI has also been defined alternatively as an
autonomous system that surpasses human capabilities at the majority of
economically valuable work.[3]"

Soon we will be defining AGI as AI that never makes mistakes, or AI that has no discernible biases, or AI that reasons as perfectly as possible using Bayesian inference, and so on.

