<br><br><div><span class="gmail_quote">On 2/16/06, <b class="gmail_sendername">Mikhail John</b> <<a href="mailto:edinsblood@hotmail.com">edinsblood@hotmail.com</a>> wrote:</span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
We've got real big bombs now. Unless the superintelligence discovers exotic<br>new physics applications, say, a force field, those bombs are going to hurt.<br>I'm assuming that it would be difficult to maintain a distributed intellect
<br>while boiling oceans and ripping continents apart, and once centralized the<br>AI will be (relatively) open to attack. Even when distributed you could<br>severely inconvenience it by severing internet hubs or somesuch, possibly
<br>opening it up for viral attack.<br><br>Because while it would be difficult to boost a continent to orbit, a Von<br>Neumann machine would be effortless. Once in space, expansion becomes<br>easier.</blockquote><div><br>
<a href="http://www.transhumanism.org/index.php/th/print/293/">http://www.transhumanism.org/index.php/th/print/293/</a><br>
<br>
Dirk <br>
</div><br></div><br>