[ExI] Limiting factors of intelligence explosion speeds
Richard Loosemore
rpwl at lightlink.com
Fri Jan 21 02:21:26 UTC 2011
spike wrote:
> ... On Behalf Of Eugen Leitl
> ...
>
>>> ... human level of thinking and creativity would be more effective if it
> were happening (say) a thousand times faster than it does now.
>
>> ...Run a dog for a gigayear, still no general relativity... Eugen
>
> Hmmm, that's a strong claim. The common ancestor of humans and dogs is less
> than a tenth of a gigayear back. That was enough time to evolve both modern
> dogs and beasts capable of discovering general relativity.
>
> If you meant running a dog under conditions that disallow genetic drift,
> that might have missed the point of speeding up the sim.
Eugen's comment -- "Run a dog for a gigayear, still no general
relativity" -- was content-free from the outset.
Who ever talked about building a dog-level AGI?
If a community of human-level AGIs were available today, and were able
to do a thousand years of research in one year, that would advance our
level of knowledge by a thousand years between now and January 20th
next year.
The whole point of having an intelligence explosion is to make that rate
of progress possible.
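The arithmetic behind that claim can be made explicit. A minimal sketch, assuming (as a simplification not stated in the post) a uniform speedup factor and no coordination overhead among the AGIs:

```python
# Hypothetical illustration: subjective research time available to a
# community of sped-up AGIs is just the wall-clock interval multiplied
# by the speedup factor. The function name and parameters are invented
# for this sketch, not taken from the original post.
def subjective_research_years(wall_clock_years: float, speedup: float) -> float:
    """Years of research completed, given a uniform speedup factor."""
    return wall_clock_years * speedup

# One calendar year at a 1000x speedup yields a millennium of research.
print(subjective_research_years(1.0, 1000.0))  # 1000.0
```

Under these assumptions the claim is a simple multiplication; the contested question in the thread is whether the speedup factor applies to human-level cognition at all, not the arithmetic itself.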
What has that got to do with running a dog simulation for a billion years?
Richard Loosemore