[extropy-chat] Fools building AIs (was: Tyranny in place)

Ben Goertzel ben at goertzel.org
Fri Oct 6 16:22:23 UTC 2006


Rationality, as I see it, is not intrinsically correlated with either
fragility or stability.

Coupled with a goal of self-preservation, rationality can of course
lead to highly effective self-preservation...

??
ben

On 10/6/06, Eugen Leitl <eugen at leitl.org> wrote:
> On Fri, Oct 06, 2006 at 11:53:30AM -0400, Ben Goertzel wrote:
>
> > > I don't know what profound rationality is, but your brand
> > > of it seems to exclude such basic fare as self-preservation.
> >
> > No... it just doesn't GUARANTEE a value being placed on self-preservation...
>
> Then it would tend to be really fragile long-term, no?
>
> --
> Eugen* Leitl http://leitl.org
> ______________________________________________________________
> ICBM: 48.07100, 11.36820            http://www.ativel.com
> 8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
>
>
>
>


