[ExI] Increasing coherence over increasing context? Or Truth?

Jef Allbright jef at jefallbright.net
Fri May 22 15:45:01 UTC 2009


On Thu, May 21, 2009 at 3:31 PM, Lee Corbin <lcorbin at rawbw.com> wrote:

> Visually, since at least age 18, I've pictured
> knowledge as residing (almost always) in the
> heads of entities, while what they know *about*
> lies outside.

That's a bit jarring to me in its mid-twentieth-century classical
simplicity.  My experience with machine learning and adaptive systems
informs me that it's not "knowledge" per se, but an /encoding/ that is
meaningless without the particular decoder (person), within an
applicable environment.  Not to embark on another thread about our
differences, but it may shed some light on the (fundamental,
information-theoretic) difficulty of conveying, not facts or
information, but context.
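A toy illustration of that information-theoretic point (my example, not from the exchange): the same bit string yields entirely different "knowledge" depending on the decoder brought to it, so the meaning is not in the bits alone.

```python
import struct

# The same four bytes, interpreted by two different "decoders".
payload = b"\x40\x49\x0f\xdb"

as_float = struct.unpack(">f", payload)[0]  # IEEE-754 float32: ~3.1415927
as_int = struct.unpack(">i", payload)[0]    # signed 32-bit int: 1078530011

print(as_float, as_int)
```

The encoding is fixed; which fact it "contains" depends entirely on the decoding context applied to it.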


>> I will observe here, again, that you seem to be searching for Truth by
>> looking closer and closer, rather than finding truth in the
>> regularities observable in the bigger picture.
>
> That could be a key difference, good point.

Some of my most technical engineers were blind to very real but very
contextual factors.  I used to tell them "Fix the customer, and then
fix the instrument if necessary."  Some got it.  Others would argue
endlessly that it *must* be the other way around.


> So very often I do want to look closer and closer
> into things, and often don't find the context
> informative. (Of course, there are plenty of
> examples where this is obviously a very dumb
> thing to do.) Sometimes we really do look to
> the 3rd or 4th significant difference in a
> measurement, where the context is
> understood (say a science laboratory).

Yes, and the same is true of solving mathematical problems and, sadly,
of most schoolwork assignments.  I also wonder and worry a bit about the
cognitive developmental influence of typical video games, where the
"rules of reality" are deduced within the closed context of the games,
strongly reinforcing deep background assumptions about the nature of
reality, and the relationship of the observer to the observed.  Does
this apply in part, for example, to the belief that there can be a
singleton AI assuring correct and safe passage for humanity?


<snip>


>> Yes.  It's analogous to how our national security apparatus has
>> traditionally operated more like a surgical team than as an immune
>> system, and how we see politics more as zero-sum conflict over
>> scarcity than positive-sum cooperation for increasing abundance. And
>> how most of us still see moral issues in terms of what is Right (the
>> inherited context), or in terms of maximizing expected utility (the
>> presently perceived context), but rarely in terms of promoting an
>> increasing context of increasingly coherent [hierarchical,
>> fine-grained] evolving values into an ever-broadening future.

Or how the operation of a voltage-follower transistor circuit doesn't
make sense without also considering the load.  ;-)
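The follower analogy can be made concrete with the standard small-signal approximation (my sketch, using textbook values, not anything from the post): an emitter follower's voltage gain depends on the intrinsic emitter resistance re = Vt/Ic in series with whatever the load makes of the emitter resistor, so the "same" circuit behaves differently depending on what it drives.

```python
def follower_gain(r_emitter, r_load=None, i_c=1e-3, v_t=0.025):
    """Approximate small-signal voltage gain of an emitter follower.

    re = v_t / i_c is the intrinsic emitter resistance (~25 ohms at 1 mA);
    the effective emitter-side resistance is r_emitter in parallel with
    any attached load.
    """
    re = v_t / i_c
    if r_load is None:
        r_eff = r_emitter
    else:
        r_eff = (r_emitter * r_load) / (r_emitter + r_load)
    return r_eff / (r_eff + re)

print(follower_gain(1000))        # unloaded: ~0.976
print(follower_gain(1000, 100))   # a 100-ohm load drags it to ~0.784
```

The circuit in isolation predicts near-unity gain; only the circuit-plus-load predicts what is actually observed, which is the point of the analogy.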


> Those errors are familiarly categorized also by
> those of us who have acquired some wisdom
> (e.g. about economics). Sorry, I've tried, but I
> still draw a blank when trying to do more than
> superficially place such errors/bad habits into
> the great scheme of things, including "the
> ever-broadening future".

Lee, thanks for the exchange.  It's ironic that the project of mapping
the evolutionary tree of our hierarchical, fine-grained values, the
phylogeny according to which we express our preferences, is analogous
to the project now recognized as mainstream science, but when it comes
to understanding our values (rather than our instrumentality) we're
still in the alchemical stage.  Understandable of course, given the
"ineffability of qualia" and the "singularity of self" which, like
phlogiston and élan vital, seem quite coherent within the context of
their time.

- Jef
