<html><head></head><body>
<div>On Sep 30, 2009 Giulio Prisco sent a first-rate post; I agree with most of it, both in letter and in spirit.</div>
<blockquote type="cite"><div>The Singularity is a clean mathematical concept—perhaps too clean</div></blockquote>
<div><br></div>
<div>True, it will not be a Singularity in the strict mathematical sense, because the rate of change will not become infinite, just far too fast for humans to deal with. Perhaps a better word would have been "Horizon", but it's too late to change now.</div>
<div><br></div>
<blockquote type="cite"><div>I suspect the changes we will see in this century, dramatic and<br>world-changing as they might appear to us, will appear as just<br>business as usual to the younger generations.</div></blockquote>
<div><br></div>
<div>I'm not saying when it will happen, but if that younger generation is still running on biology, sooner or later change will come too fast for them to cope with, because of the pokey subsonic signals in their brains; nerve impulses travel at only about 100 meters per second.</div>
<div><br></div>
<blockquote type="cite"><div>I must admit to a certain skepticism toward FAI: if super intelligences are<br>really super intelligent (that is, much more intelligent than us), they will be<br>easily able to circumvent any limitations we may try to impose on them.</div></blockquote>
<div><br></div>
<div>I agree 100%, and yet [...]</div>
<div><br></div>
<blockquote type="cite"><div>Eliezer Yudkowsky and the Singularity Institute for Artificial<br>Intelligence propose that research be undertaken to produce friendly<br>artificial intelligence (FAI) in order to address the dangers.</div></blockquote>
<div><br></div>
<div>And they've actually convinced themselves it could work! It's amazing how wishful thinking can delude even the most powerful minds.</div>
<div><br></div>
<blockquote type="cite"><div>Very few transhumanists think practical, operational indefinite life<br>extension and mind uploading will be a reality in the next two or<br>three decades. Probably Kurzweil himself does not _really_ believe it.</div></blockquote>
<div><br></div>
<div>Oh, I think Kurzweil really believes it, and is in fact absolutely certain of it; that doesn't mean he's correct, of course, although he may be.</div>
<div><br></div>
<blockquote type="cite"><div>Similarly, I don’t see a Singularity in 2045.</div></blockquote>
<div><br></div>
<div>I refuse to give a date because, with the significant exception of Moore's Law, history has shown that our ability to pin a date on a future technological development is pretty poor, and that is before the huge acceleration that will happen as we approach the Singularity. I will say that even if the Singularity doesn't happen for a thousand years, in 999 years it will still seem like a very long way off; so whenever it happens, it will come as a big surprise to most.</div>
<div><br></div>
<blockquote type="cite"><div>I think one Kurzweil is worth thousands of critics.</div></blockquote>
<div><br></div>
<div>Absolutely!</div>
<div><br></div>
<div>John K Clark</div>
</body></html>