<html><head></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space; "><br><div><div>On Dec 8, 2009, at 7:04 AM, Gordon Swobe wrote:</div><br class="Apple-interchange-newline"><blockquote type="cite"><div>--- On Tue, 12/8/09, John Clark <<a href="mailto:jonkc@bellsouth.net">jonkc@bellsouth.net</a>> wrote:<br><br><blockquote type="cite">But for some reason this mysterious phase change only<br></blockquote><blockquote type="cite">happens to 3 pounds of grey goo in our head and never<br></blockquote><blockquote type="cite">happens in his Chinese Room. He never explains why.<br></blockquote><br>He explains exactly why in his formal argument: <br><br>Premise A1: Programs are formal (syntactic).<br>Premise A2: Minds have mental contents (semantics). <br>Premise A3: Syntax is neither constitutive of nor sufficient for semantics. <br><br>Ergo, <br><br>Conclusion C1: Programs are neither constitutive of nor sufficient for minds.<br><br>So then Searle gives us at least four targets at which to aim (three premises and the opportunity to deny that his conclusion follows).<br><br>He continues with more formal arguments to defend his philosophy of mind, what he calls biological naturalism, but if C1 doesn't hold then we needn't consider them. <br><br>I came back to ExI after a long hiatus (I have 6000+ unread messages in my ExI mail folder) because I was struck by the fact that Wernicke's aphasia lends support to A3, normally considered the only controversial premise in his argument.<br><br>-gts<br><br><blockquote type="cite">Like most of us here, he subscribes to and<br></blockquote><blockquote type="cite">promotes a species of naturalism. He [Searle] adamantly<br></blockquote><blockquote type="cite">rejects both property and substance dualism. 
You won't<br></blockquote><blockquote type="cite">find any mystical hocus-pocus in his philosophy.<br></blockquote><blockquote type="cite"><br></blockquote><blockquote type="cite">Bullshit. He thinks intelligent behavior is possible<br></blockquote><blockquote type="cite">without consciousness, so evolution could not have produced<br></blockquote><blockquote type="cite">consciousness, no way no how. He has no other explanation<br></blockquote><blockquote type="cite">of how it came to be, so to explain its existence he has no<br></blockquote><blockquote type="cite">choice but to resort to mystical<br></blockquote><blockquote type="cite">hocus-pocus.<br></blockquote><blockquote type="cite">He allows for the possibility<br></blockquote><blockquote type="cite">of Strong Artificial Intelligence. He just doesn't think<br></blockquote><blockquote type="cite">it possible with formal programs running on hardware. Not<br></blockquote><blockquote type="cite">hardware enough!<br></blockquote><blockquote type="cite"><br></blockquote><blockquote type="cite">So if atoms are arranged in a way that produces a<br></blockquote><blockquote type="cite">human brain, those atoms can produce consciousness, and if<br></blockquote><blockquote type="cite">arranged as a computer they can too, provided the computer<br></blockquote><blockquote type="cite">doesn't use hardware or software. 
Don't you find<br></blockquote><blockquote type="cite">that idea just a little bit stupid?<br></blockquote><blockquote type="cite"> John K Clark <br></blockquote><blockquote type="cite"><br></blockquote><blockquote type="cite">_______________________________________________<br></blockquote><blockquote type="cite">extropy-chat mailing list<br></blockquote><blockquote type="cite"><a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a><br></blockquote><blockquote type="cite"><a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br></blockquote><blockquote type="cite"><br></blockquote></div></blockquote></div><br></body></html>