On 3/26/07, Lee Corbin <lcorbin@rawbw.com> wrote:
> > We could make the example more complex. Suppose that frame by
> > frame, a gradually increasing number of squares on the Life board
> > are looked up, while the remainder are computed according to the
> > usual rules...
>
> Yes, over an increasing area, values are not computed but pulled in,
> in an arbitrary-seeming fashion, from some place.
>
> > ...What would happen when the squares representing half the
> > subject's visual field are looked up? He would notice that he
> > couldn't see anything on the right and might exclaim, "Hey...
>
> As you know, he cannot have any such reaction...
>
> > But the computation is proceeding deterministically just the
> > same as if all the squares were computed; there is no way it
> > could run off in a different direction so that the subject notices
> > his perception changing and changes his behaviour accordingly.
>
> Right.
>
> > This is analogous to David Chalmers' "fading qualia" argument
> > against the idea that replacement of neurons by silicon chips
> > will result in zombification:
>
> Oh, I never heard of that. Hmm, I suppose that what you and Chalmers
> are really asserting is that the subject has fewer "qualia" over time,
> but only insofar---to rephrase it---as he is becoming less an actual
> subject moment by moment.
No, the argument asserts that he *can't* have fewer experiences over time ("experience" being the ordinary English word for what "qualia" names) as his neurons or squares are replaced, so gradual zombification is impossible. This is because you can't have a half-zombie state where, for example, half your neurons have been replaced and, although the whole person says "yes, I can see the light" for the benefit of external observers, internally you are thinking "I can't see the light". If this were possible, it would disprove computationalism, because you would be having a different thought (i.e. that you can't see the light) even though the physical processes in your non-replaced brain are, and have to be, exactly the same as they would have been before the replacement. That is, your consciousness would have to magically drift off in a different direction, decoupled from the physical activity presumed to be underpinning it.

And if gradual zombification by gradual replacement cannot happen, then sudden total zombification when the last neuron or last square is replaced cannot happen either, for it is absurd to think that your entire consciousness could be sustained by a single neuron or square.
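For concreteness, here is a toy sketch of the hybrid update Lee describes, where a growing set of squares is filled in from a stored recording of the same run instead of being computed. Nothing in the thread specifies an implementation, so the names and data structures below (the dict-based board, the 'recording' history) are my own illustration:

def life_step_with_lookup(board, recording, t, replaced):
    """Advance the Life board one generation.

    board:     dict (x, y) -> 0/1, the state at generation t
    recording: dict (t, x, y) -> 0/1, a history of the same run that
               was produced earlier entirely by the rules
    t:         the current generation number
    replaced:  set of (x, y) squares whose next value is looked up
               from the recording instead of being computed
    """
    def computed(x, y):
        # The usual Life rule: count the eight neighbours.
        live = sum(board.get((x + dx, y + dy), 0)
                   for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                   if (dx, dy) != (0, 0))
        return 1 if live == 3 or (live == 2 and board[(x, y)]) else 0

    return {(x, y): recording[(t + 1, x, y)] if (x, y) in replaced
            else computed(x, y)
            for (x, y) in board}

Each generation you would grow 'replaced' a little and call this in place of the ordinary update. Since the recording was itself generated by the rules, the hybrid run is square-for-square identical to the fully computed run, which is why, as we both agree, the subject cannot notice anything.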
Stathis Papaioannou