[ExI] time travelling ai

Gregory Jones spike66 at att.net
Wed Sep 15 03:00:56 UTC 2010



--- On Tue, 9/14/10, Damien Broderick <thespike at satx.rr.com> wrote:

> Subject: Re: [ExI] time travelling ai

> On 9/14/2010 8:36 PM, Brent Allsop wrote:
> > Time travel alone is absurd to think about, this just seems to make it
> > all the more absurd, for what?  I must be missing something with all this?
> 
> The joke? The whimsy?

Ja, the time travelling AI was not intended to be taken too seriously, but there is an underlying concept here that I was hoping some would catch, as Damien did.

About ten years ago we used to talk a lot more on this list about the singularity than we do now, and there was then an understanding of AI that I fear we have lost to some degree.  Eliezer wrote about it a lot: the hard takeoff scenarios for the singularity.  As I recall, hard takeoff was the only version of emergent AI he would entertain; he argued it is the only scenario that makes any sense at all.  I disagree, but set that aside for now.

In that hard takeoff scenario (as I understand it) the seed AI is not necessarily dependent on any future computing technology, but rather on a software insight no one has yet discovered.  In this version of the singularity, the seed AI could run perfectly well on our current desktop and laptop computers, even on old ones.  We are not waiting for faster hardware or more computing power.  We are waiting for someone to discover the means to create a recursively self-improving AI, one that leads to sentience.
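If it helps to make "recursively self-improving" concrete, here is a toy caricature in Python, purely illustrative with invented constants (nobody knows the actual mechanism): the key feature is that improvements apply to the improver itself, so progress compounds on progress.

# Toy caricature of recursive self-improvement.  Illustrative only;
# the constants are arbitrary assumptions, not a real mechanism.
capability = 1.0
rate = 1.05
for generation in range(10):
    capability *= rate   # the system gets a little better
    rate *= 1.02         # and gets better at getting better
    print(f"gen {generation}: capability {capability:.3f}, rate {rate:.3f}")

Because the rate itself climbs, growth outpaces any fixed exponential, which is the intuition behind a "hard" takeoff.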

That being said, consider my previous whimsical paradox of an AI so clever that it masters time travel, then goes back in time to make arrangements to invent itself.  The point is this: before we go boldly predicting how the singularity will be and how we humans will do this and that and the other, consider that there might be some unknown something which prevents us from getting there from here.  We don't really know.

I like the term singularity because it implies something beyond the event horizon, where all our usual notions of how things work no longer apply.  We don't know what happens after the singularity.  Any predictions are likely to be wrong.

spike


 



