[extropy-chat] Fragmentation of computations
Stathis Papaioannou
stathisp at gmail.com
Mon Mar 26 02:31:04 UTC 2007
On 3/26/07, Lee Corbin <lcorbin at rawbw.com> wrote:
> > We could make the example more complex. Suppose that frame by
> > frame, a gradually increasing number of squares on the Life board
> > are looked up, while the remainder are computed according to the
> > usual rules...
>
> Yes, over an increasing area, values are not computed but pulled-in,
> in an arbitrary seeming fashion from some place
>
> > ...What would happen when the squares representing half the
> > subject's visual field are looked up? He would notice that he
> > couldn't see anything on the right and might exclaim, "Hey...
>
> As you know, he cannot have any such reaction...
>
> > But the computation is proceeding deterministically just the
> > same as if all the squares were computed; there is no way it
> > could run off in a different direction so that the subject notices
> > his perception changing and changes his behaviour accordingly.
>
> Right.
>
> > This is analogous to David Chalmers's "fading qualia" argument
> > against the idea that replacement of neurons by silicon chips
> > will result in zombification:
>
> Oh, I never heard of that. Hmm, I suppose that what you and Chalmers
> are really asserting is that the subject has fewer "qualia" over time,
> but only insofar---to rephrase it---as he is becoming less an actual
> subject moment by moment.
No, the argument asserts that he *can't* have fewer experiences over time
("experiences" being an ordinary English word for the same thing "qualia"
names) as his neurons or squares are replaced, so that gradual zombification
is impossible. This is because you can't have a half-zombie state where, for
example, half your neurons are replaced and, although the whole person says
"yes, I can see the light" for the benefit of external observers, internally
you are thinking "I can't see the light". If this were possible, it would
disprove computationalism, because you would be having a different thought
(i.e. that you can't see the light) even though the physical processes in
the non-replaced part of your brain are, and have to be, exactly the same as
they would have been had no replacement taken place. That is, your
consciousness would have to magically drift off in a different direction,
decoupled from the physical activity presumed to be underpinning it.

And if gradual zombification by gradual replacement cannot happen, then
sudden total zombification when the last neuron or last square is replaced
cannot happen either, for it is absurd to think that your entire
consciousness could be sustained by a single neuron or square.
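
For concreteness, here is a minimal sketch of the hybrid update being
discussed. It assumes a finite toroidal Life board with the live cells
stored as a Python set, plus a pre-recorded run of the same board;
recorded_next and in_lookup_region are illustrative names, not anything
specified above:

    def neighbours(cell, width, height):
        # The eight cells around (x, y), wrapping at the board edges.
        x, y = cell
        return {((x + dx) % width, (y + dy) % height)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)}

    def life_rule(cell, live, width, height):
        # Standard Life: born with 3 live neighbours, survives with 2 or 3.
        n = len(neighbours(cell, width, height) & live)
        return n == 3 or (n == 2 and cell in live)

    def hybrid_step(live, recorded_next, in_lookup_region, width, height):
        # One generation: squares inside the lookup region are pulled in
        # from the recording, the remainder computed by the usual rules.
        next_live = set()
        for x in range(width):
            for y in range(height):
                cell = (x, y)
                if in_lookup_region(cell):
                    alive = cell in recorded_next   # looked up, not computed
                else:
                    alive = life_rule(cell, live, width, height)
                if alive:
                    next_live.add(cell)
        return next_live

Since the recording was itself produced by the same rules, hybrid_step
yields exactly the same next frame whatever in_lookup_region is; the region
can be widened frame by frame (say, over the squares encoding half the
visual field) without the trajectory, and hence the subject's behaviour,
changing at all.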
Stathis Papaioannou