[ExI] Sensory Reality

Aware aware at awareresearch.com
Wed Dec 23 20:19:57 UTC 2009


On Wed, Dec 23, 2009 at 10:43 AM, JOSHUA JOB <nanite1018 at gmail.com> wrote:

>> of reality. For the most part, "we" are not in that functional loop.
>
> We most certainly are...

Huh.  Well, given your certainty, what do you suppose might have made
Jef say such a thing?


>> ...we nevertheless feel that (based on our
>> undeniable experience) somewhere in the system there must be such
>> essentials as "qualia", "meaning", "free will" and "personal
>> identity."  Never had it, don't need it.  Derivative, yes; essential, no.
>
> Meaning is a relation...
> Free will is simply...
> Personal identity is simply...

I AM suggesting that your view is a little too simple...


>>> We revise our model continually based on all of
>>> our sense-perceptions, and over time it gets better and better, closer
>>> and closer to reality as it is "in-itself."
>>
>> While this is the popular position among the scientifically literate
>> (including most of my friends), similar to the presumption that
>> evolution continues to "perfect" its designs, with humans being "the
>> most evolved" at present, it's easy enough to show that our scientific
>> progress may be leading us in a present direction quite different from
>> the direction we may be heading later.  If 50 years from now we
>> commonly agree that we're living within a digital simulation, would
>> you then say Newton or Einstein was closer to the absolute Truth of
>> gravity?  The best we can aim for is not increasing "Truth", but
>> increasing coherence over increasing context of observation.
>
> Well evolution perfects its designs for the given context.

Not "perfects", but satisfices, within a particular (eventually
changing) environment.

> Humans are just
> as evolved as alligators in that sense.

Huh?  I could argue that alligators are more adapted to their usual
environment, or I could argue that humans are more adapted to a more
complex environment, but "just as"?


> Just because our current theories are likely incorrect does not mean we
> aren't moving toward truth.

It doesn't mean they necessarily /are/, at any particular moment, either.

> Our work on our present theories is in the worst
> case necessary for us to rule them out as incorrect in some way, thereby
> eliminating another possibility (and likely in the process a whole class of
> possibilities along with it), thereby moving a bit closer toward a correct
> representation.

This idea of successive approximation to absolute truth, which is
popularly accepted and even taught in schools, is the misconception I
was trying to highlight for you.


> The more things your model can predict (not explain, predict in advance)
> without getting some things wrong, the better it is.

Yes, at any given moment.

Consider the information content of an account of THE TRUTH of our
world 1000 years ago.  How does that compare with the information
content of a true account of our world, EFFECTIVE FOR PREDICTION
today?  What about THE TRUTH, EFFECTIVE FOR PREDICTION, 50 years from
now?  Can you really argue, on the basis of predictive effectiveness,
that we keep getting closer to knowing THE TRUTH?

From a practical point of view, how might it affect your long-term
risk management and social policy if you knew that, rather than
society getting closer and closer to THE TRUTH, we are actually
gaining increasing instrumental effectiveness within an environment of
ever-increasing uncertainty?


> By saying
> that the most you can hope for is increased coherence and increased context
> of observation, you are saying that the most you can hope for is getting
> closer to the truth. After all, if you take the limit of increasing
> coherence and context of observation, what do you get? A true representation
> of reality.

The key point you seem to be missing is that at any given moment, you
have no way of knowing, no frame of reference, to tell you whether
you're moving toward or away from the "Truth" you'll find yourself at
after 10, 50, 100 years more observation, nor do you know how long
you've been moving in that direction (since you can't know what it
is), nor do you know how close "to the limit" (1%, 10%, 90%) you are.
So in a practical and moral sense, the best you can do is strive for
increasing coherence over increasing context of observation.  There is
nothing more, there never was.  So much for ultimate grounding,
ultimate Truth.


> So you are moving in that direction...

Which direction?  If you were to say "outward" then as an extropian I
suppose I would agree...


> ...even if for some reason you (and many others in philosophy,
> particularly philosophy of science) do not think so

Want another practical example of this "philosophy"?  Well, consider
two agents (tribes, for example) in conflict, each with their own
"truth."  How should they approach agreement for their mutual benefit?
I've already given you the answer.


> for whatever reason.

Your "for whatever reason" seems significant, especially since you
deleted without comment the part of my post where I said the
following:

>> The difference is one of context, and the test to distinguish which is
>> which is whether one can coherently explain the basis for the other's belief.

- Jef
