[extropy-chat] Fools building AIs (was: Tyranny in place)

Ben Goertzel ben at goertzel.org
Sat Oct 7 03:25:58 UTC 2006


Hi,

> "In economics, sociology, and political science, a decision or situation is often called rational if it is in some sense optimal, and individuals or organizations are often called rational if they tend to act somehow optimally in pursuit of their goals.

Acting optimally in pursuit of one's goals is a decent interpretation
of the "rationality" concept.

At least, it makes clear that rationality is not tied to any
particular goal system (e.g. individual survival).

I would temper it, though, to "Acting optimally in pursuit of one's
goals, where the optimization takes into account one's intrinsic
computational constraints."

Another sense of rationality could be defined in terms of logic: if a
person is rational and accepts a set of premises, they should accept
every logical conclusion of those premises whose proof is short, or
whose proof they are explicitly shown.
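To make the "short proofs" clause concrete, here is a minimal sketch (mine, not from the post), under the assumption that the premises are Horn clauses and that "short proof" means derivable within a bounded number of forward-chaining rounds:

```python
# Sketch of the logic-based notion of rationality: an agent accepting a set
# of premises also accepts any conclusion with a short proof. Premises are
# Horn clauses (body -> head); "short" = few forward-chaining rounds.

def short_consequences(facts, rules, max_steps=3):
    """Return all atoms derivable from `facts` via `rules` in <= max_steps rounds."""
    accepted = set(facts)
    for _ in range(max_steps):
        new = {head for body, head in rules
               if set(body) <= accepted and head not in accepted}
        if not new:
            break
        accepted |= new
    return accepted

premises = {"p"}
rules = [({"p"}, "q"), ({"q"}, "r")]
print(sorted(short_consequences(premises, rules)))  # ['p', 'q', 'r']
```

On this reading, an agent who accepts "p" but rejects "r" after being walked through the two-step derivation would count as irrational, while failing to spot a conclusion whose shortest proof is very long would not.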

This logic-based definition of rationality is also not tied to any
particular goal system...

-- Ben


