[ExI] Hard Takeoff

John Grigg possiblepaths2050 at gmail.com
Fri Nov 19 07:58:05 UTC 2010


Spike wrote:
>The real risks are not those on which the world seems so focused.  The
>commies fizzled out without a bang, global warming isn't coming for us,

Arnold will be coming for you when he finds out you don't believe in
global warming!

>Malthusian population growth isn't going to kill us, radicalized
>Presbyterians are not coming,

Radicalized atheists and Evangelicals living in the same nation scare
me!!! lol  Oh, and the militant front of the Salvation Army needs to
be watched closely...

But seriously, Spike, what if the AGI turns out to be *gay???*  I
worry about this all the time, but the Evangelicals have yet to write
books about it or churn out documentaries!  I'd rather have a
heterosexual or at least non-sexual AGI enslave or wipe out humanity
than endure a flamingly homosexual AGI being "nice" to everyone, and
doing massive planetary redecorating...

John  ; )


On 11/18/10, spike <spike66 at att.net> wrote:
> ...On Behalf Of Mike Dougherty
> Subject: Re: [ExI] Hard Takeoff
>
> On Tue, Nov 16, 2010 at 5:31 PM, spike <spike66 at att.net> wrote:
>>>>> We know the path to artificial intelligence is littered with the
>>>>> corpses of those who have gone before.  The path beyond artificial
>>>>> intelligence may one day be littered with the corpses of our dreams,
>>>>> of our visions, of ourselves.
>>
>>>>Gee Spike, isn't it difficult to paint a sunny day with only black paint?
>>
>>> Mike, we must recognize both the danger and promise of AGI.  We might
>>> have a chance to get it exactly right on the first try, but only one.
>
>>Agreed.  I just hope vigilance doesn't necessarily have to be so gloomy.  :)
> ...
>>I'm just joking (mostly) - as your words made me both laugh and recoil...
>
> Good, then I did it right.  Laughing and recoiling is a good combination.
> Mike, emergent AGI is one of very few areas where I get dead serious.  I
> once thought nanotech gray goo was the biggest threat to humanity, but since
> Ralph Merkle's talk at Nerdfest, I came to realize to my own satisfaction
> that humanity would not master nanotech.  Rather, a greater than human AGI
> would do it.
>
> The real risks are not those on which the world seems so focused.  The
> commies fizzled out without a bang, global warming isn't coming for us,
> Malthusian population growth isn't going to kill us, radicalized
> Presbyterians are not coming, certainly not in the time scale we have left
> before this singularity is likely to show up.  I am convinced to my own
> satisfaction that the most likely scenario is that an AGI somehow comes
> into existence, then it does what lifeforms do: reproduces to its capacity,
> by converting all available metals to copies of itself, with the term metal
> meaning everything to the left of the far right-hand column of the periodic
> chart.  It may or may not upload us.  It may or may not attempt to preserve
> us in our current form.
>
> Our current efforts might influence the AGI but we have no way to prove it.
> Backing away from the AGI development effort is not really an option, or
> rather not a good one, for without an AGI, time will take us all anyway.  I
> give us a century, two centuries as a one-sigma case.
>
> Mike, given that paradigm, are my previous comments understandable?
>
> spike
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>