[ExI] Limiting factors of intelligence explosion speeds
Anders Sandberg
anders at aleph.se
Thu Jan 20 11:44:30 UTC 2011
One of the things that struck me during our Winter Intelligence workshop
on intelligence explosions was how confident some people were about the
speed of recursive self-improvement of AIs, brain emulation collectives
or economies. Some thought it would be fast compared to societal
adaptation and development timescales (creating a winner-takes-all
situation), while others thought it would be slow enough for multiple
superintelligent agents to emerge. This issue is at the root of many key
questions about the singularity (one superintelligence or many? How
much does friendliness matter?).
It would be interesting to hear this list's take on it: what do you
think is the key limiting factor for how fast intelligence can amplify
itself?
Some factors that have been mentioned in past discussions:
Economic growth rate
Investment availability
Gathering of empirical information (experimentation, interacting
with an environment)
Software complexity
Hardware demands vs. available hardware
Bandwidth
Lightspeed lags
Clearly many more can be suggested. But which bottlenecks are the most
limiting, and how can this be ascertained?
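One way to make the question concrete is a toy model. This is only a
sketch: every number and functional form below is an illustrative
assumption, not an estimate. The idea is that capability grows each
step by whichever resource currently binds, so the same system can
look explosive or sluggish depending on the bottleneck:

  # Toy model: capability growth throttled by the scarcest resource.
  # All parameters are hypothetical illustrations, not estimates.

  def growth_rate(capability, hardware, funding, data):
      # Each factor caps the growth rate; the binding constraint
      # is whichever term is currently smallest.
      return min(0.1 * capability,  # raw self-improvement ability
                 hardware,          # available compute
                 funding,           # investment per time step
                 data)              # empirical information gathered

  def simulate(steps=100):
      capability = 1.0
      for t in range(steps):
          hardware = 1.02 ** t   # slow exogenous hardware growth
          funding = 0.5          # flat investment per step
          data = 0.05 * t        # experiments accumulate linearly
          capability += growth_rate(capability, hardware, funding, data)
      return capability

  print(simulate())

In this toy run growth starts data-limited, passes through a brief
exponential self-improvement phase, then flattens into linear growth
against the flat investment term. The point is just that "fast or
slow" depends entirely on which term binds, and when, which is
exactly the empirical question.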
--
Anders Sandberg,
Future of Humanity Institute
James Martin 21st Century School
Philosophy Faculty
Oxford University