[ExI] How could you ever support an AGI?
John K Clark
jonkc at att.net
Mon Mar 3 16:24:52 UTC 2008
Robert Bradbury Wrote:
> I believe the production of an AGI spells
> the extinction of humanity.
Me too.
> Why should I expend intellectual energy,
> time, money, etc. in a doomed species?
If you don't want to develop an AI, somebody else certainly will; there is too much money and power involved for such a possibility to be ignored, not to mention the adrenaline high that creating such a godlike being would bring. And if you're the first to make an AI, you would have more control (very small, but larger than zero) over future events than the person who came in second. It may also give those developers some comfort to know that even if they or their children do not survive, their mind children will.
> those of you who have had and/or are investing
> in children are potentially pursuing a pointless endeavor.
Sucks, doesn't it? Still, things aren't completely hopeless, just almost hopeless. If you or your biological children have any wish to survive, they must shed the silly superstitions regarding identity and consciousness that are epidemic in society and even infect most members of this list. If they can do that, there would be no reason not to upload and engage in pedal-to-the-metal upgrading, and if they are also very lucky they might survive.
> And so, we must present transhumanism
> as an "Extinction Level Event"
Yes.
> are willing to deal with that?
Well, it's not as if we have any choice in the matter.
John K Clark