I have not posted to the list in some time. And due to philosophical differences I will not engage in open discussions (which are not really open!).<br><br>But a problem has been troubling me recently as I have viewed press releases for various AI conferences.<br>
<br>I believe the production of an AGI spells the extinction of humanity. More importantly, it has what I would call back-propagating effects. Why should I expend intellectual energy, time, money, etc. on a doomed species? Put another way, those of you who have had and/or are investing in children are potentially pursuing a pointless endeavor. If an AGI develops, or is developed, their existence is fairly pointless. Our current culture obviously shows that absorption is nearly instantaneous for younger minds. They will know they are "obsolete" in an AGI world.<br>
<br>So, given some limited genetic drive to keep making humans, that will last a while. But I see no way out of the perspective that the general development (vs. the managed development) of an AGI leads to the extinction of humanity.<br>
<br>And so, we must present transhumanism as an "Extinction Level Event" -- are we willing to deal with that?<br><br>Robert<br><br>