[ExI] How Close is the AI Intelligence Explosion?
Jason Resch
jasonresch at gmail.com
Mon Mar 24 19:43:23 UTC 2025
On Fri, Mar 21, 2025 at 4:10 PM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> This article from the Future of Life Institute suggests that the
> intelligence explosion could be only a few years away.
> BillK
>
> <https://futureoflife.org/ai/are-we-close-to-an-intelligence-explosion/>
> Quotes:
> Are we close to an intelligence explosion?
>
Great article. I particularly liked this chart:
https://futureoflife.org/wp-content/uploads/2025/03/Test-scores-of-Al-systems-capabilities-Our-World-in-Data-1024x723.png
I can't help but feel that we are presently in the midst of an intelligence
explosion. More and more, people are relying on AI to make more intelligent,
better-informed decisions in their day-to-day lives and jobs, and to
summarize books and articles so they can absorb more information more
quickly.
AI researchers are using AI to write better, more optimized code for future
iterations of AI. Humans, for now, remain part of this loop, which is why
it is not going as quickly as it could. But as time progresses, I think
humans are gradually stepping back and becoming an ever-smaller part of
this recursive loop of self-improvement.
>
> AIs are inching ever-closer to a critical threshold. Beyond this
> threshold are great risks—but crossing it is not inevitable.
> Published: March 21, 2025 Author: Sarah Hastings-Woodhouse
>
Even exponential functions are continuous, so there may never be the feeling
of a discontinuous break-out moment; we'll just see an ever-increasing
slope, with ever-faster progress per unit of time.
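That point can be illustrated numerically. The sketch below (my own toy
model, not anything from the article) assumes a hypothetical capability
metric that doubles every two years: each year's gain is larger than the
last, yet every year's gain relates to the previous year's by the same
constant factor, so no single year looks like "the" break-out moment.

```python
import math

def capability(t, doubling_years=2.0):
    """Hypothetical capability level growing exponentially with time t (years)."""
    return 2.0 ** (t / doubling_years)

# Progress made during each successive year:
yearly_gains = [capability(t + 1) - capability(t) for t in range(10)]

# Each year's gain is larger than the last (ever-increasing slope)...
assert all(b > a for a, b in zip(yearly_gains, yearly_gains[1:]))

# ...yet the ratio between consecutive gains is constant (sqrt(2) for a
# 2-year doubling time): the curve is smooth, with no discontinuity to
# single out as the moment the explosion "happened".
ratios = [b / a for a, b in zip(yearly_gains, yearly_gains[1:])]
assert all(abs(r - math.sqrt(2)) < 1e-9 for r in ratios)
```

The same holds for any doubling time: the curve only looks explosive in
hindsight, never at any particular instant.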
> Timelines to superintelligent AI vary widely, from just two or three
> years (a common prediction by lab insiders) to a couple of decades
> (the median guess among the broader machine learning community). That
> those closest to the technology expect superintelligence in just a few
> years should concern us. We should be acting on the assumption that
> very powerful AI could emerge soon.
>
AI is arguably already smarter than humans (by most metrics, anyway,
according to that chart). How exactly are we to define "superintelligence"
beyond the vaguely defined "smarter than any human"? There are many levels
of superintelligence beyond human intelligence. Humans are much closer to
the intelligence of a pocket calculator than to the level of a Matrioshka
brain or a kilogram of computronium:
https://docs.google.com/spreadsheets/d/1_8QfebbBvQXo_3OroBhOfp24RAJPKCM4e_q5njbfBbU/edit?usp=sharing
Jason