[ExI] Limiting factor to the Intelligence Singularity?

efc at swisscows.email efc at swisscows.email
Sat Dec 23 20:56:05 UTC 2023

On Sat, 23 Dec 2023, Kelly Anderson via extropy-chat wrote:

> Is it possible that the exponential curve towards the singularity has
> a hidden negative signal of increased resistance to progress because
> of the required size of the team? Might this be one reason that we

Hello Kelly,

I have often thought about the same thing myself, especially in the face of
AI-doomers. What if there is a "hard limit" on intelligence? That would
be a disappointment, but also very fascinating!

It is of course not sound to generalize from biological intelligence
to non-biological intelligence (though who knows, maybe the same principles
apply?), but it seems to me that the more intelligent the person, the
higher the likelihood of psychological problems and of difficulty
navigating the physical world successfully. So maybe evolution has settled
on a "functional span" for intelligence, where both too much and too little
are not conducive to passing on one's genes?

Of course, that's just my own subjective experience with very smart
people, so I would be interested in whether there is in fact a correlation
between very high intelligence and psychological problems/instability,
or whether that is just an urban myth.

Best regards, 
