[extropy-chat] Best To Regard Free Will as Existing

Lee Corbin lcorbin at rawbw.com
Thu Apr 5 00:31:27 UTC 2007


Eugen writes

> On Tue, Apr 03, 2007 at 12:59:58PM -0700, Lee Corbin wrote:
>> It was inevitable that this discussion come sooner or later to hinge right
>> on the point of free will.
> 
> How can you prove you've got free will, or not, empirically?

For exactly what kind of evidence are you asking?  I'm not sure
anything can be "proved" outside of mathematics. Can you prove
that the island of Manhattan exists?


The kinds of systems that should be said to be willful are those
that exhibit a certain integrity of purpose.

Systems, then, that are willful and make decisions can be
said to have free will when---as I said in my earlier email---
they analyse a great deal of data towards the resolution
of some query, or towards picking a single action from a large
number of possible actions. (Take that as a characterization,
not a definition.)

For example, a weather forecasting program could reasonably
be said to have free will, since its output is a complex function
of all the inputs, which must be carefully weighed.  The system
would be exhibiting almost no free will if a programmer
maliciously stuffed data into an output buffer, or somewhere near
the terminus of the calculation, effectively short-circuiting
the calculation.
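The contrast above can be put in code. This is only a toy sketch of my
characterization, not a real forecasting model: the weights and the
hard-coded value 42.0 are invented for illustration.

```python
def forecast(inputs):
    """A decision that genuinely depends on its inputs: the output
    is a weighed function of all the data.  (Hypothetical weights,
    chosen only for illustration.)"""
    weights = [0.5, 0.3, 0.2]
    return sum(w * x for w, x in zip(weights, inputs))

def short_circuited_forecast(inputs):
    """The malicious-programmer case: the inputs are accepted but
    the value stuffed into the output ignores them entirely."""
    _ = inputs            # data arrives but is never weighed
    return 42.0           # answer fixed in advance of any calculation
```

On this characterization the first function exhibits (a little) free will
--- change its inputs and its output changes --- while the second exhibits
almost none, since its "decision" was settled before any data arrived.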

Do you ever decide anything?  If so, how can you defend the
notion that you decide things, when---if determinism holds---
the answer was decided long ago?  I can answer that one
myself, but can you?

> If I gave you a dump from /dev/random along with /dev/urandom,
> would you be able to tell those apart?

Given enough bytes from urandom, I could presumably, eventually,
detect the entropy falling off. But pray, what has this to do with anything?
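For what it's worth, here is a minimal sketch of the sort of empirical
test one would run on such a dump --- a chi-square check of byte
frequencies against uniformity. I use `os.urandom` merely to stand in
for a dump from either device; in practice both kernel sources pass
simple tests like this, which is rather Eugen's point.

```python
import os
from collections import Counter

def chi_square_uniform(data: bytes) -> float:
    """Chi-square statistic of byte frequencies against a uniform
    distribution over 0..255.  For genuinely uniform bytes the
    statistic hovers near 255 (the degrees of freedom); a strongly
    biased source drives it far higher."""
    expected = len(data) / 256
    counts = Counter(data)
    return sum((counts.get(b, 0) - expected) ** 2 / expected
               for b in range(256))

sample = os.urandom(65536)      # stand-in for a /dev/urandom dump
stat = chi_square_uniform(sample)
```

Of course, passing such a test shows only the absence of gross bias,
not the presence of "true" randomness --- which bears on how little an
empirical dump can settle about free will.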

> What does it even mean in a process that is us, since the top-level
> lags in realization of a made-up decision at a lower level.

I may identify with the entire process, top to bottom. If the real
reason that I turn down a certain applicant is because she smells,
I still should accept responsibility for that decision, even though
I've meanwhile unconsciously fabricated all sorts of rationalizations
for why she isn't suitable for the job.  True:  my decision wasn't
quite as free as it would have been if my nasal equipment hadn't
led to a short-circuiting of my decision process.

> Assuming you knew the universe is deterministic at sub-Planck
> level, would it help you to make a killing on blue chips, or
> even avoid that car speeding round the corner?

Why, no, of course determinism doesn't help *me* make decisions.
They're still mine to make;  determinism changes nothing. 

Time to go on offense again:  if all your actions are predetermined,
why do you try so hard to avoid the speeding car?  Oh---because
it was determined that you would so strive?  I see.  But then, isn't
that an explanation or reason for everything you do?

Lee



