[ExI] End of time?

spike spike66 at att.net
Thu Sep 30 03:35:17 UTC 2010


> ...On Behalf Of John Grigg
> Subject: Re: [ExI] End of time?
> 
> Spike wrote:
> I was scared senseless that the Big Rip is 5 million years 
> from now.  But when they clarified it is 5 billion years 
> hence, I calmed a bit.
> 
> Spike, I don't blame you for being scared senseless!  If 
> calorie restriction diets, Kurzweilian predictions, or 
> cryonics pan out, five million years will just not be enough 
> time to have fun!  John

Actually that was based on an Asimov gag.

Check this out, however.  The singularity has been identified as one of
eight possible end-of-the-world Armageddons by Fox News.  They don't use the
term singularity, but their description sounds like it to me.  I thought it
pretty cool for a mainstream media source:

http://www.foxnews.com/scitech/2010/09/29/end-of-the-world-potential-armageddon/

[Begin quote]  

7.  Computers Take Over Everything

One potential cataclysm could already be happening -- one we've created
ourselves. As computer technology becomes more advanced, "thinking machines"
could eventually emerge that control banks, stock markets, and airports. It
sounds like something out of the Terminator movies, but the reality is that
"self-aware" machines could become self-replicating. 

Initially, this could mean just a bug that infects computer systems
controlling transportation and finance, leading to mass pandemonium. Yet a
more dangerous threat is from artificial intelligence (AI). McQuade suggests
that AI could become more advanced than human intelligence. Once it does,
the machines could develop their own programming routines -- or decide that
humans aren't necessary. Or take over nuclear armaments and other
stockpiles.

"AI is a field that seeks to engineer not just faster-than-human
intelligence, but qualitatively better than human intelligence," McQuade
said. "Because AI could learn extremely fast (through recursive
self-improvement), it would have the capacity, in a short period of time, to
make significant leaps in 'intelligence' until it demonstrates qualitatively
better-than-human intelligence."

[end quote]

spike



