<br><br>On Thursday, February 19, 2015, John Clark <<a href="mailto:johnkclark@gmail.com">johnkclark@gmail.com</a>> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Wed, Feb 18, 2015 Stathis Papaioannou <span dir="ltr"><<a href="mailto:stathisp@gmail.com" target="_blank">stathisp@gmail.com</a>></span> wrote:<font color="#500050"><br><br></font><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div><div>> Functions in the brain are, to an extent, localised. </div></div></div></blockquote><div><br></div><div>Memory doesn't seem to be localized, and there is no way to know, or at least no way to prove, if consciousness is.</div></div></div></div></blockquote><div><br></div><div>It is usually believed that the various types of experiences are localised to cortical regions. However, if consciousness is diffusely spread over the brain, the argument just needs to be modified slightly so that the replacement is of part of the putative consciousness-generating mechanism.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div><div>> if a part of the brain is damaged it results in specific deficits in function, while other functions are left unaffected.</div></div></div></blockquote><div><br></div><div>If region X of the brain is damaged we know that some behaviors change and others do not; we can make educated guesses, but we have no way of PROVING whether consciousness is destroyed or not. 
</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div><div>> So if the visual cortex is taken out the subject can't see, although he can speak normally</div></div></div></blockquote><div><br></div><div>And the exact same thing would happen if the eyeballs of the subject were taken out. What does that teach you about the nature of consciousness? Nothing, as far as I can tell.</div></div></div></div></blockquote><div><br></div><div>If the eyeballs are removed, the subject can still remember and describe visual experiences, while if the entire visual cortex is removed they can't. However, that isn't essential to the argument: the only assumption is that consciousness is due to something in the brain, and then we consider what happens if we partly replace that something with a non-conscious but otherwise normally functioning analogue.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div><div>
> Now, what happens if you replace the visual cortex with a perfect functional analogue which, however, lacks the special "function" of consciousness?</div></div></div></blockquote><div><br></div><div>If it's the biological visual cortex that generates your consciousness and it is removed and replaced by an electronic visual cortex that does everything just as well as the biological version EXCEPT for generating consciousness, then an intelligent conscious being has been turned into an intelligent zombie. </div></div></div></div></blockquote><div><br></div><div>A partial zombie only, since only vision is affected. Consciousness consists of multiple modalities. Blind people are not necessarily zombies.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div>> Now I hope you can see that if EVERY FUNCTION THAT CAN POSSIBLY BE SCIENTIFICALLY TESTED FOR is incorporated into the artificial visual cortex then it will receive and process input and send output to the rest of the brain in the same way as the original visual cortex;<br></div></div></blockquote><div><br></div><div>Yes.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div>> the subject will behave completely normally;</div></div></blockquote><div><br></div><div>Yes.</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div>> what would happen if there is a function that can't be scientifically tested for, responsible for 
visual perception (i.e. consciousness) in the cortex?</div></div></blockquote><div><br></div><div>They're not the same thing: visual perception can be tested for; consciousness of what is perceived cannot be. </div></div></div></div></blockquote><div><br></div><div>I use the word "perception" as synonymous with qualia or experience. If a camera is not conscious then I would not say that a camera perceives light. </div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div>> That function would be left out and the subject would be blind</div></div></blockquote><div><br></div><div>If a being responds to light in its environment then it may or may not be conscious, but it is certainly not blind. </div></div></div></div></blockquote><div><br></div><div>Again, I use the word "blind" to mean lacking in visual qualia. 
A subject with a damaged visual cortex responds to light to an extent, since the pupils constrict when a light is shone in the eyes, but he does not have any visual perception when this happens.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div>> but because the artificial visual cortex is sending all the right signals to his speech centres, and every other part of his brain, he doesn't realise he is blind</div></div></blockquote><div><br></div><div>If the biological visual cortex is what generates consciousness and it has been removed then he doesn't realize ANYTHING; he's a zombie. He could still be intelligent, witty, charming and sexy, but he would have no more consciousness than a brick.</div></div></div></div></blockquote><div><br></div>You seem to be assuming that there is a single all-modalities consciousness mechanism in the brain. What would happen if you took out just a part of this mechanism?<br><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div>> and he still declares that he can see normally.<br></div></div></blockquote><div><br></div><div>Yes, because his behavior is unaffected: he said "I can see normally" before the operation, so he'd say the same thing after it. 
</div><div><br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex">> it would be possible to remove a major aspect of a person's consciousness, such as visual perception, but they would behave normally and they would not notice that anything had changed.</blockquote><div><br></div><div>Their bodies would behave just as they always did, but they wouldn't notice anything; they're zombies. </div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left-width:1px;border-left-color:rgb(204,204,204);border-left-style:solid;padding-left:1ex"><div dir="ltr"><div>> So do you see the problem with this?<br></div></div></blockquote><div><br></div><div>Yes, the theory that intelligent behavior and consciousness can be separated can't be proven wrong and will never be proven wrong, so the idea is silly, very, very silly. </div><div><br></div><div> John K Clark</div><div><br></div></div><br></div></div>
</blockquote><br><br>-- <br>Stathis Papaioannou<br>