On 2/14/06, Robert Bradbury <robert.bradbury@gmail.com> wrote:

> up on us and suddenly manifest itself as the overlord, or that each
> country is going to try to build its own superintelligence. What I was
> attempting to point out is that we don't need to allow that to happen.
> A Playstation 5 or 6 is probably going to have the computational
> capacity to enable more than human-level intelligence (though I doubt
> the computational architecture will facilitate that). One can,
> however, always unplug them if they get out of line.
>
> It's obviously relatively easy for other countries to detect
> situations where some crazy person (or country) is engaging in
> unmonitored superintelligence development. Any time they start
> constructing power-generating capacity significantly in excess of what
> the people are apparently consuming, and/or start constructing cooling
> towers sized to dump not only a reactor's waste heat but all of the
> electricity it produces (since computation ultimately dissipates as
> heat), it will be obvious what is going on and steps can be taken to
> deal with the situation.

And I maintain that it will *not* be obvious.
The kind of 'petty' superintelligence that requires only a puny 1 MW of
power, utilised as efficiently as the human brain, would yield an
intelligence some 10,000x greater than human. IMO even a factor of 10
could prove exceedingly dangerous, let alone 10,000.
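
(For the arithmetic behind that 10,000x: it follows if one assumes
roughly 100 W per human-equivalent, a figure not stated above and
closer to the whole body's resting draw than the brain's ~20 W:

  1 MW / 100 W = 10^6 W / 10^2 W = 10^4 human-equivalents.)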

And such an intelligence is going to be able to provide very significant incentives for its 'owners' not to pull any plugs.

Dirk