[ExI] Yes, the Singularity is the greatest threat to humanity

Anders Sandberg anders at aleph.se
Mon Jan 17 22:37:44 UTC 2011


John Clark wrote:
> On Jan 16, 2011, at 7:17 PM, Anders Sandberg wrote:
>
>> any singularity with a relatively slow acceleration [...]
>
> A slow singularity is a contradiction in terms. If you can't make a 
> prediction even a subjectively short amount of time into the future 
> that is even approximately correct, then it's a singularity; if you 
> can, then it's not.

The term "technological singularity" has misleading connotations, since 
it primes intuitions of something pointlike, infinite, etc. It is used 
with several different meanings; a few are listed in
http://agi-conf.org/2010/wp-content/uploads/2009/06/agi10singmodels2.pdf
The senses I used above were B and C, self-improving technology and 
intelligence explosion. There is no clear reason why these *have to* 
produce change faster than societal timescales, or imply a strong 
prediction horizon.

One of the big insights from today's intelligence explosion workshop was 
that people often hold very strong and confident views on whether 
intelligence explosions will be fast (and localized) or slower (and 
occur across an economy), but they do not seem to have arguments for 
their positions that would actually justify that level of confidence. 
So I think one important conclusion is that we should not be confident 
at all about likely speeds - our intuitions are probably heavily biased.

-- 
Anders Sandberg,
Future of Humanity Institute
Philosophy Faculty of Oxford University 
