[ExI] Hard Takeoff

Samantha Atkins sjatkins at mac.com
Mon Nov 15 17:04:07 UTC 2010


On Nov 14, 2010, at 9:59 AM, Michael Anissimov wrote:

> Here's a list I put together a long time ago:
> 
> http://www.acceleratingfuture.com/articles/relativeadvantages.htm
> 
> Say I meet someone like Natasha or Stefano, but I know they haven't been exposed to any of the arguments for an abrupt Singularity.  Someone more new to the whole thing.  I mention the idea of an abrupt Singularity, and they react by saying that that's simply secular monotheism.  Then, I present each of the items on that AI Advantage list, one by one.  Each time a new item is presented, there is no reaction from the listener.  It's as if each additional piece of information just isn't getting integrated.  
> 
> The idea of a mind that can copy itself directly is a really huge deal.  A mind that can copy itself directly is more different from us than we are from most other animals.  We're talking about an area of mindspace way outside what we're familiar with.  
> 
> The AI Advantage list matters to any AI-driven Singularity.  You may say that it will take us centuries to get to AGI, so therefore these arguments don't matter, but if you think that, you should explicitly say so.  The arguments about whether AGI is achievable by a certain date and whether AGI would quickly lead to a hard takeoff are separate arguments -- as if I need to say it.  

With full acknowledgement of the AI advantages, which I certainly understand, how hard a takeoff will or will not ensue once AGI is achieved remains quite speculative.  

> 
> What I find is that people don't like the *connotations* of AI and are much more concerned about the possibility of THEM PERSONALLY sparking the Singularity with intelligence enhancement, so therefore they underestimate the probability of the former simply because they never care to look into it very deeply.  There is also a cultural dynamic in transhumanism whereby interest in hard takeoff AGI is considered "SIAI-like" and implies that one must be culturally associated with SIAI.  

How come?  I. J. Good was talking about this nearly half a century ago.  It is not an SIAI-specific meme in the least.  But I have noticed the tendency of many to associate an idea with its currently most prolific or most well-known expounders.

- s
