[ExI] Increasing coherence over increasing context? Or Truth?
Jef Allbright
jef at jefallbright.net
Mon May 18 17:17:11 UTC 2009
On Sun, May 17, 2009 at 9:55 PM, Lee Corbin <lcorbin at rawbw.com> wrote:
>
> Here is what the driving analogies are: our models
> (or theories *about*) physical reality. Suppose that
> you and I are measuring a temperature or merely the
> length of a rod. It is EXTREMELY USEFUL, I contend,
> to maintain that our measurements are converging on
> something.
Lee, once upon a time I attempted to convey to you the distinction
between precision and accuracy.
<http://lists.extropy.org/pipermail/extropy-chat/2008-October/046099.html>
During my more than three decades in the business of analytical
instruments, I worked with countless customers who would naively ask
about the accuracy (veracity, truth) of their instrument. Nearly
always, I would have to explain that the instrument performs in terms
of sensitivity, stability, and, most importantly, precision. And
that is all that is needed, and entirely sufficient, for any of their
process-control purposes.
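The distinction can be made concrete with a small sketch (the numbers
are purely illustrative): an instrument can repeat its readings very
tightly (high precision) while sitting a fixed offset away from any
reference value (poor accuracy), and for process control it is the
spread, not the offset, that matters.

```python
import statistics

# Hypothetical repeated readings of a rod whose reference length is
# 100.00 mm. The instrument carries a systematic offset (~ +0.30 mm)
# but its repeatability is very tight.
readings = [100.31, 100.29, 100.30, 100.32, 100.28]

precision = statistics.stdev(readings)        # spread of repeated readings
bias = statistics.mean(readings) - 100.00     # offset from the reference

print(f"precision (stdev): {precision:.3f} mm")  # small: the instrument repeats well
print(f"bias vs reference: {bias:.3f} mm")       # nonzero: it is "inaccurate"
```

The process-control customer cares that `precision` is small; the
`bias` term only matters once a traceable reference enters the picture.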
If they actually needed accuracy, then it was obtainable by taking the
measurement results and calibrating them relative to a reference
standard traceable to NIST or some other institution.
But here's the key point: If NIST were to arbitrarily modify their
standard, and everybody recalibrated to it so they again had a common
basis for comparison, everything would work just as well. Accuracy
has NO MEANING independent of context.
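The recalibration point can be sketched the same way (again with
illustrative numbers and a hypothetical `calibrate` helper): shift the
standard's assigned value, recalibrate everyone to it, and every
comparison between measurements comes out the same.

```python
def calibrate(raw, standard_reading, standard_value):
    """Offset-correct a raw reading onto the standard's scale."""
    return raw + (standard_value - standard_reading)

standard_reading = 100.30      # what the instrument reads on the standard
raw_a, raw_b = 105.10, 98.70   # two raw measurements to be compared

# Calibrated against the standard's original assigned value:
a1 = calibrate(raw_a, standard_reading, 100.00)
b1 = calibrate(raw_b, standard_reading, 100.00)

# After the standard's assigned value is arbitrarily redefined and
# everyone recalibrates to the new value:
a2 = calibrate(raw_a, standard_reading, 100.25)
b2 = calibrate(raw_b, standard_reading, 100.25)

# The difference between the two measurements, which is what process
# control actually uses, is identical under either standard (to within
# floating-point rounding).
print(a1 - b1, a2 - b2)
```

Only the shared basis for comparison moves; nothing about the relative
results, and hence nothing of pragmatic consequence, changes.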
> Something real, i.e., though the measuring rod we
> know to be a host of dancing sub-elementary particles
> (again, we "know" as an approximation to something
> that somehow really does make up the measuring rod),
> the thing we're trying to measure is on average of
> our measurements closer and closer to something,
> and our rod is (can be measured to be) more and
> more exactly some multiple of the one they keep
> in Paris.
What do you imagine is the pragmatic difference between your "closer
and closer to Something" (implying increasing accuracy relative to a
Something which you acknowledge is inherently ultimately unknowable)
and my "increasing precision" within any particular measurement
context? Which delivers the better results? [Don't ignore the
non-negligible inefficiencies and errors introduced by any unnecessary
process.]
Lee, please don't forget that I spent over twenty years successfully
managing highly technical teams within a highly competitive
environment. I'm not speaking from the ivory towers of academia, nor
from a background in the Humanities steeped in Postmodernist
Deconstruction, nor from any vague, mush-headed mystical point of
view.
My larger point is not that epistemological reductionism is wrong, but
that as a framework for decision-making it is incomplete, with serious
ramifications for the rational application of increasing
instrumentality within an increasingly uncertain world.
- Jef