[ExI] What would an IQ of 500 or 1000 look like?

Jason Resch jasonresch at gmail.com
Mon May 25 09:54:53 UTC 2015


Thanks for that link. I'm sure it will prove useful in the future.

I very much appreciated your framing of it in terms of how animals see our
behavior. It reminded me a bit of a movie I quite enjoyed:
http://www.imdb.com/title/tt0070544/ (Fantastic Planet)

Numbers like 10^160 or 10^199 may begin to approach the number of
combinations of intelligence-related genes in the current human genome. So I
think the IQ scale has a definite upper bound: beyond some point, even
smarter humans aren't significantly greater in capacity or ability than
other humans, even though they are comparatively far rarer and thus sit
higher on the scale.
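These rarity figures can be reproduced with the standard asymptotic
upper-tail bound for the normal distribution, P(Z > z) ~ exp(-z^2/2) /
(z*sqrt(2*pi)), which is the same kind of approximation used in the formula
Anders links below. A quick sketch (assuming the conventional IQ scale with
mean 100 and SD 15):

```python
import math

def log10_tail(z):
    """log10 of P(Z > z) for a standard normal, via the large-z
    asymptotic P(Z > z) ~ exp(-z**2 / 2) / (z * sqrt(2 * pi))."""
    return -(z * z / 2 + math.log(z * math.sqrt(2 * math.pi))) / math.log(10)

for iq in (190, 500, 1000):
    z = (iq - 100) / 15  # conventional IQ scale: mean 100, SD 15
    print(f"IQ {iq}: z = {z:.3f}, P ~ 10^{log10_tail(z):.0f}")
```

For z = 27 this gives about one chance in 10^160; computing the probability
directly would underflow a double, which is why one has to work with the
log, as Anders does below.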

Another way of looking at it: the computational capacity of the brain is
bounded by the efficiency of its neurons, which in turn is tied to the
brain's metabolism. There's only so much computation a human brain can
perform when powered by a human diet. As impressive as von Neumann's
abilities were, his memory, real-time translation, and calculating ability
are nothing special compared to what any computer of today can do. An
intelligence scale for computers, which won't be subject to such definite
energy or processing limits, would be far more open-ended, I would think.
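The metabolic bound can be given a rough ceiling with a Landauer-limit
back-of-envelope. The inputs below (a ~20 W power budget for the brain,
~310 K operating temperature) are ballpark assumptions, and real neurons
operate many orders of magnitude above this thermodynamic floor, so this
only caps irreversible bit operations:

```python
import math

k_B = 1.380649e-23             # Boltzmann constant, J/K
T = 310.0                      # assumed operating temperature, K (~body temp)
E_bit = k_B * T * math.log(2)  # Landauer limit: min energy per bit erased
P_brain = 20.0                 # assumed metabolic power budget of the brain, W

max_bit_ops = P_brain / E_bit  # hard ceiling on irreversible bit ops per second
print(f"Thermodynamic ceiling: ~{max_bit_ops:.1e} irreversible bit ops/s")
```

Any brain powered by a human diet sits somewhere below that ceiling; a
machine intelligence can simply buy more watts.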

Here is some interesting reading on non-human superintelligence:


On Mon, May 25, 2015 at 4:06 AM, Anders Sandberg <anders at aleph.se> wrote:

> Jason Resch <jasonresch at gmail.com>
> The calculator breaks down above a z-score of 6, which has a probability
> of about 1 in a billion, corresponding to an IQ of about 190. An IQ of 500
> would have a z-score of 26.667, and an IQ of 1000 a z-score of 60. So we
> must ask: out of 10^20 or 10^30 naturally born humans, how smart would the
> smartest human out of those ~10^30 be?
> A while ago I dug up an approximate formula that is applicable:
> http://www.aleph.se/andart/archives/2009/09/ten_sigma_numerics_and_finance.html
> So z=27 gives one chance in 10^160. z=60 is way outside my numerical
> precision when calculated straight, and the probability is around one in
> 10^199 when I just take the log of the equation. That is essentially one
> out of every particle that has ever existed or will ever exist in the
> observable universe.
> In the end, talking about IQ 500 is almost as confused as talking about
> doubling IQs: this is not what the scale is about. It is a bit like
> discussing how loud the big bang was (although, see
> https://telescoper.wordpress.com/2009/04/26/how-loud-was-the-big-bang/ )
> I think a better approach would be to consider how an animal would
> perceive human intelligence. We do incomprehensible or arbitrary things -
> generate some odd sounds, move stuff about, handle objects - and then big
> outcomes occur, often for no obvious reason - food, images or rooms appear,
> other humans just do things as if they knew what we were thinking.
> Sometimes the point becomes somewhat clear far in retrospect, but most of
> the time there is no discernible link. And of course, many of the things
> that humans worry or enthuse about are things the animals simply do not get
> - why would a human get sad over a piece of paper with scribbles on it?
> So I would expect superintelligences to be like this. They do stuff, stuff
> happens, and sometimes we can see that some desire or goal seems to have
> been met. If they are human-derived we can sometimes recognize familiar
> drives, but also totally alien interests. In many ways they
> would be confusing and boring, except when they decide to play with us.
> Anders Sandberg, Future of Humanity Institute, Faculty of Philosophy,
> Oxford University
