[ExI] what if... the singularity isn't near?
John Clark
johnkclark at gmail.com
Thu Nov 6 14:08:05 UTC 2025
On Wed, Nov 5, 2025 at 6:22 AM Ben Zaiboc via extropy-chat
<extropy-chat at lists.extropy.org> wrote:

On 05/11/2025 03:39, spike wrote:

>> it appears to me that what we are calling AI might be a kind of false
>> alarm

> I for one think it is, at least in the sense that what we call AI now
> is not going to develop into general intelligence, much less
> superintelligence. It's far too narrow in the kind of information
> processing it does, compared to the only generally intelligent thing we
> know: our own brains.

Today, AIs are not better than the best humans at everything, but they are
better than the best humans at some things, and those things were once
thought of as excellent examples of intelligence. And today's AIs are better
than the average human at nearly everything except manual dexterity. For
some reason people treat "AGI" and "superintelligence" as synonyms, and that
causes nothing but confusion.

> If intelligence is substrate-specific, and biology is the only thing that
> can sustain it, then we're really in trouble, and I think all bets are off.

I think the likelihood of that being true is about equal to the likelihood
that the holy rollers and snake handlers will turn out to be right. And I'm
not holding my breath.

> In fact, if that is true, then all of science is in trouble. We might
> as well start taking religion and other kinds of magical thinking
> seriously.

Exactly.

John K Clark