[extropy-chat] Criticizing One's Own Goals---Rational?

Ben Goertzel ben at goertzel.org
Thu Dec 7 14:25:34 UTC 2006


> > If rationality is using cognition to find ways of achieving goals, then
> > using cognition to erase goals would be irrational.

The relationship between rationality and goals is fairly subtle, and
something I have been thinking about recently....  To address the
issue I will introduce a series of concepts related to goals.


A supergoal is defined as a goal of a system that is not, to any
significant extent, a subgoal of any other goal of that system.

With this in mind, regarding creation and erasure of goals, there are
two aspects which I prefer to separate:

1) optimizing the set of subgoals chosen in pursuit of a given set of
supergoals.  This is well-studied in computer science and operations
research.  Not easy computationally or emotionally, but conceptually
straightforward to understand.
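As an illustrative sketch (the goal names, values, and costs below are made up for the example), subgoal selection under a resource budget reduces to a classic optimization problem -- here, a 0/1 knapsack solved by brute force:

```python
from itertools import combinations

# Hypothetical subgoals: (name, expected contribution to the supergoal, resource cost)
subgoals = [
    ("study", 8, 5),
    ("network", 5, 3),
    ("publish", 6, 4),
    ("travel", 3, 4),
]
budget = 9  # total resources available

def best_subgoal_set(subgoals, budget):
    """Brute-force the subset of subgoals maximizing total expected
    contribution without exceeding the resource budget (0/1 knapsack)."""
    best, best_value = (), 0
    for r in range(len(subgoals) + 1):
        for subset in combinations(subgoals, r):
            cost = sum(c for _, _, c in subset)
            value = sum(v for _, v, _ in subset)
            if cost <= budget and value > best_value:
                best, best_value = subset, value
    return [name for name, _, _ in best], best_value

print(best_subgoal_set(subgoals, budget))  # (['study', 'publish'], 14)
```

Brute force is exponential, of course; the point is only that subgoal choice for a *fixed* supergoal is a well-posed search problem, which is exactly why it is the conceptually easy case.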

2) optimizing the set of supergoals.  This is a far, far subtler thing.

Supergoal optimization must be understood from a perspective of
dynamical systems theory, not from a perspective of logic.

A strongly self-modifying AI system will be able to alter its own
supergoals....  So can a human, to an extent, with a lot of effort....


Next, I think it is worthwhile to distinguish two kinds of goals
-- explicit goals: those that a system believes it is pursuing
-- implicit goals: those that a system acts as if it is pursuing

Definition: a "coherent goal achiever" is one whose implicit goals and
explicit goals are basically the same
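One could quantify "basically the same" in any number of ways; as a toy sketch (the goal sets are hypothetical), here is coherence measured as the Jaccard overlap between the two goal sets:

```python
def coherence(explicit, implicit):
    """Jaccard overlap between the explicitly held goal set and the
    goal set implied by behavior: 1.0 means a fully coherent goal
    achiever, 0.0 means no overlap at all."""
    explicit, implicit = set(explicit), set(implicit)
    if not explicit and not implicit:
        return 1.0  # vacuously coherent: no goals either way
    return len(explicit & implicit) / len(explicit | implicit)

# One shared goal out of three distinct goals total:
print(coherence({"write", "exercise"}, {"write", "browse"}))  # 0.3333333333333333
```

The hard part in practice is, of course, inferring the implicit goal set from behavior in the first place; the measure above assumes that inference has already been done.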

What is interesting, then, is the dynamics of coherent goal achievers
that are also strongly enough self-modifying to modify their
supergoals....  In this case, what properties control the evolution of
the supergoal-set over time?  This is closely related to Friendly AI,
of course....


Next, there is the notion of a "meta-goal": a supergoal designed to
coexist with other supergoals and to regulate the process of supergoal
modification.
For instance, a friend of mine has a metagoal of streamlining and
simplifying his set of supergoals.  I have a metagoal of making sure
my various sometimes-contradictory supergoals all cooperate with each
other in an open and friendly way, rather than being competitive and
antagonistic.

To me, rationality has two aspects:

1) how effectively one achieves one's explicit goals, given the
constraints imposed by the resources at one's disposal.

2) how coherent one is as a goal-achiever (implicit goals = explicit goals)

IMO, revising one's supergoal set is a complex dynamic process that is
**orthogonal** to rationality.  I suppose that Nietzsche understood
this, though he phrased it quite differently.  His notion of
"revaluation of all values" is certainly closely tied to the notion of
supergoal-set refinement/modification....

Refining the goal hierarchy underlying a given set of supergoals is a
necessary part of rationality, but IMO that's a different sort of
process than revising the supergoals themselves.
In general, it would seem important to be aware of when you are
non-rationally revising a supergoal versus "merely" rationally
modifying the set of subgoals used to achieve some supergoal.  And
yet, the two processes are very closely tied together.


One very common phenomenon is when a supergoal is erased, but one of
its subgoals is promoted to the level of supergoal.  For instance,
originally one may become interested in science as a subgoal of
achieving greatness, but later on one may decide greatness is childish
and silly, but retain the goal of being a scientist (now as a
supergoal rather than a subgoal).

When subgoal promotion happens unintentionally it is called subgoal
"alienation."  This happens because minds are not fully self-aware.  A
supergoal may be erased without all subgoals that it spawned being
erased along with it.  So, e.g. even though you give up your supergoal
of drinking yourself to death, you may involuntarily retain your
subgoal of drinking (even though you started doing it only out of a
desire to drink yourself to death).
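Both cases -- deliberate promotion and accidental alienation -- can be sketched with a toy goal graph (the class, its methods, and the example goals are all hypothetical, invented for illustration):

```python
class GoalHierarchy:
    """Toy goal graph: each goal maps to the set of supergoals it serves.
    A goal with no parents is, by definition, a supergoal."""

    def __init__(self):
        self.parents = {}  # goal -> set of goals it is a subgoal of

    def add(self, goal, serves=()):
        self.parents[goal] = set(serves)

    def erase_supergoal(self, goal, erase_orphans=True):
        """Erase a goal.  If erase_orphans, subgoals left with no
        remaining parents are erased along with it (full self-awareness);
        otherwise they survive as new parentless supergoals -- i.e.
        subgoal promotion, or "alienation" when unintentional."""
        self.parents.pop(goal, None)
        for sub, supers in list(self.parents.items()):
            supers.discard(goal)
            if not supers and erase_orphans:
                self.erase_supergoal(sub, erase_orphans)

    def supergoals(self):
        return {g for g, supers in self.parents.items() if not supers}

h = GoalHierarchy()
h.add("achieve greatness")                          # supergoal
h.add("do science", serves=["achieve greatness"])   # subgoal
h.erase_supergoal("achieve greatness", erase_orphans=False)
print(h.supergoals())  # {'do science'} -- the subgoal was promoted
```

The drinking example is the `erase_orphans=False` path taken involuntarily: the supergoal is popped, but the mind never walks the graph to clean up the subgoals it spawned, so they linger on as supergoals in their own right.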

-- Ben G
