[extropy-chat] Fools building AIs (was: Tyranny in place)
ben at goertzel.org
Fri Oct 6 16:22:23 UTC 2006
Rationality, as I see it, is not intrinsically correlated with either
fragility or stability.
Coupled with a goal of self-preservation, rationality can of course
lead to highly effective self-preservation...
On 10/6/06, Eugen Leitl <eugen at leitl.org> wrote:
> On Fri, Oct 06, 2006 at 11:53:30AM -0400, Ben Goertzel wrote:
> > > I don't know what profound rationality is, but your brand
> > > of it seems to exclude such basic fare as self-preservation.
> > No... it just doesn't GUARANTEE a value being placed on self-preservation...
> Then it would tend to be really fragile long-term, no?
> Eugen* Leitl leitl http://leitl.org
> ICBM: 48.07100, 11.36820 http://www.ativel.com
> 8B29F6BE: 099D 78BA 2FD3 B014 B08A 7779 75B0 2443 8B29 F6BE