<br><br><div class="gmail_quote">On Thu, Jun 12, 2008 at 11:01 PM, hkhenson <<a href="mailto:hkhenson@rogers.com">hkhenson@rogers.com</a>> wrote:<br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
<div class="Ih2E3d">
<br>
</div>I am surprised you would even consider the singularity in terms of<br>
geography. If a smarter-than-human AI existed anywhere within light<br>
hours of the net it would have huge effects unless it was blocked<br>
from communicating. If it were blocked, it could be co-located with<br>
MAE West and have no impact.</blockquote><div><br>Keith, a smarter-than-human intelligence could be an enhanced human, a network of interfaced humans, or some other non-AI superintelligence. These possibilities were raised in Vinge's original essay.<br>
<br></div><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">Having been locked up in solitary confinement with very<br>
limited communications recently I can state that you *really* don't<br>
want to do that to something smarter than humans. It's bad enough to<br>
lock up an engineer.</blockquote><div><br>You wouldn't want to do that because it would keep the AI from helping us, but attributing feelings of resentment to an AI simply because you lock it up is projecting human psychology onto a non-human entity.</div>
</div><br>-- <br>Michael Anissimov