On 9/8/06, Ben Goertzel <ben@goertzel.org> wrote:
> ** examples of Singularity-list issues would be: discussions of
> Friendly AI which don't pertain to specifics of AGI architectures;
> discussions of non-AGI Singularity topics like nanotech or biotech, or
> Singularity-relevant sociopolitical issues.
Of course one *could* separate out the classical singularity concept [1], since it is quite possible for lifespan extension, solutions to energy problems, advances in educational systems, and even robust nanotechnology to occur, enabling a relatively robust transhumanist-flavored future *without* any of the disruption of a rapid-rate singularity era (including AGI).
Robert

1. "Classical" meaning when change gets too fast for standard-issue humans to absorb it.