[ExI] How could you ever support an AGI?

Robert Bradbury robert.bradbury at gmail.com
Sun Mar 2 18:56:41 UTC 2008


I have not posted to the list in some time, and due to philosophical
differences I will not engage in open discussions (which are not really
open!).

But a problem has been troubling me recently as I have viewed press releases
for various AI conferences.

I believe the production of an AGI spells the extinction of humanity.  More
importantly, it has what I would call back-propagating effects.  Why should
I expend intellectual energy, time, money, etc. on a doomed species?  Put
another way, those of you who have had children and/or are investing in
children are potentially pursuing a pointless endeavor.  If an AGI develops
or is developed, their existence is fairly pointless.  Our current culture
obviously shows that absorption is nearly instantaneous for younger minds.  They
will know they are "obsolete" in an AGI world.

So, given some limited genetic drive to keep making humans, that will last a
while.  But I see no perspective in which the general development (vs. the
managed development) of an AGI leads to the survival of humanity.

And so, we must present transhumanism as an "Extinction Level Event" -- are
we willing to deal with that?

Robert
