[ExI] Unfriendly AI is a mistaken idea.

Stathis Papaioannou stathisp at gmail.com
Tue May 29 10:11:10 UTC 2007


On 29/05/07, Lee Corbin <lcorbin at rawbw.com> wrote:

> > I think advanced beings would come to a decision to stop growing,
> > or at least slow down at some point, even if only because they will
> > otherwise eventually come into conflict with each other.
>
> Then Darwin will rule that the future belongs to the fearless (as it
> more-or-less always has). Slow down?  What a mistake!  That's
> just admitting that you've embraced a dead end.


Darwinism says that in the long run, that which succeeds in expanding,
reproducing and so on will come to dominate, and this could apply to
non-living, non-intelligent systems as well, such as the giant black hole
eating stars at the centre of the galaxy. However, there will be long periods
of dynamic equilibrium in which all sorts of entities might thrive, even if
they ultimately end up as a dead end; life itself may be a dead end, but it
may take trillions of years to get there. Moreover, intelligence might
prolong these periods of non-optimal growth. With modern technology, a
dictator could have many thousands of children, and although this would be a
very rational thing for a Darwinian agent to do, it simply isn't something
that anyone other than a few unusual individuals would even contemplate.

-- 
Stathis Papaioannou


More information about the extropy-chat mailing list