[ExI] Unfriendly AI is a mistaken idea

Russell Wallace russell.wallace at gmail.com
Mon May 28 17:08:04 UTC 2007


On 5/28/07, Lee Corbin <lcorbin at rawbw.com> wrote:
>
> I totally agree with your words "spontaneously extrapolate
> all the ways it might get better".  It's not going to do that
> unless it's designed to do that by us and by previous
> versions of itself, OR---and this is the big "or"---these
> talents evolve spontaneously in a highly competitive
> environment where over some lengthy period of time
> multiple AIs and their descendants compete vigorously
> against each other.


Depends on what they're competing for. Today's programs already compete
vigorously against each other, but they're not even starting to move in the
direction of such spontaneous extrapolation. That's unsurprising: in a
man-made technosphere, the will to power is not an adaptive quality in a
machine! There's no selection pressure in that direction - quite the
opposite. It would be like expecting bacteria in a hot spring to evolve cold
tolerance.

Now suppose we built a lot of highly sophisticated automated factories,
putting in the $zillion worth of R&D to make them capable of surviving
independently in the wild and robust to small changes. Suppose we then wrote
a population of AI programs to control them, with another $zillion worth of
R&D to give those programs enough of an initial will to survive that they'd
start adjusting themselves to compete better. Then suppose we shut down all
progress in human civilization to give them a chance to catch up, and waited
a zillion years for evolution to take its course. In that case, a
world-conquering AI just might eventually emerge.

But a theorem-proving program running on a cluster of PCs spontaneously
exhibiting such behavior is about as likely as putting a glob of mud in a
test tube and creating crocodiles by spontaneous generation.