On 5/16/06, <b class="gmail_sendername">Samantha Atkins</b> <<a href="mailto:sjatkins@mac.com">sjatkins@mac.com</a>> wrote:<br>
<div><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;"><div><div style=""><div>I
understand it to be what happens after such greater-than-human
intelligence comes about. I don't see any way to get a hard takeoff
without this.</div></div></div></blockquote></div><br>
Oh, okay - well, I don't think I can predict in any detail what will
happen after superintelligence comes about. I don't subscribe to the
idea that there's something specially unpredictable about
superintelligence, but there are too many variables between now and
then. I think it's a question worth revisiting in a few decades when we
might have a clearer idea of the pathways towards superintelligence in
the first place.<br>