[ExI] Maybe AGI is just the latest conspiracy theory?

Jason Resch jasonresch at gmail.com
Wed Nov 5 13:03:42 UTC 2025


On Wed, Nov 5, 2025, 6:27 AM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> This very long article discusses the history and development of AGI.
> Basically, asking the question "Is there any 'there' there?".
> Or is it mostly just hype?
> BillK
>
> <https://www.technologyreview.com/2025/10/30/1127057/agi-conspiracy-theory-artifcial-general-intelligence/>
> Quotes:
> How AGI became the most consequential conspiracy theory of our time
>
> The idea that machines will be as smart as—or smarter than—humans has
> hijacked an entire industry. But look closely and you’ll see it’s a
> myth that persists for many of the same reasons conspiracies do.
>

If you read this paper, which came out a few years ago, you will see and
understand that AGI is already here:

https://arxiv.org/abs/2303.12712


> By Will Douglas Heaven, October 30, 2025
>
> Stripped back to its essentials, the argument for AGI rests on the
> premise that one technology, AI, has gotten very good, very fast, and
> will continue to get better. But set aside the technical
> objections—what if it doesn't continue to get better?—and you’re left
> with the claim that intelligence is a commodity you can get more of if
> you have the right data or compute or neural network. And it’s not.
>
> Intelligence doesn’t come as a quantity you can just ratchet up and
> up. Smart people may be brilliant in one area and not in others. Some
> Nobel Prize winners are really bad at playing the piano or caring for
> their kids. Some very smart people insist that AGI is coming next
> year.
>

What we really mean by "general intelligence" is a large bundle of
distinct competencies.

"Each practitioner thinks there’s one magic way to get a machine to be
smart, and so they’re all wasting their time in a sense. On the other hand,
each of them is improving some particular method, so maybe someday in the
near future, or maybe it’s two generations away, someone else will come
around and say, ‘Let’s put all these together,’ and then it will be smart."
-- Marvin Minsky

Mastering language really was key, because most accumulated human
knowledge is represented in language, and because every problem that
requires intelligence can be framed as a particular pattern to be
learned and predicted. Language is a universal medium for encoding and
representing those patterns.

Jason