[ExI] Wernicke's aphasia and the CRA.

John Clark jonkc at bellsouth.net
Tue Dec 8 15:35:30 UTC 2009


On Dec 8, 2009, at 7:04 AM, Gordon Swobe wrote:

> --- On Tue, 12/8/09, John Clark <jonkc at bellsouth.net> wrote:
> 
>> But for some reason this mysterious phase change only
>> happens to 3 pounds of grey goo in our head and never
>> happens in his Chinese Room. He never explains why.
> 
> He explains exactly why in his formal argument: 
> 
> Premise A1: Programs are formal (syntactic).
> Premise A2: Minds have mental contents (semantics). 
> Premise A3: Syntax is neither constitutive of nor sufficient for semantics. 
> 
> Ergo, 
> 
> Conclusion C1: Programs are neither constitutive of nor sufficient for minds.
> 
> So then Searle gives us at least four targets at which to aim (three premises and the opportunity to deny that his conclusion follows).
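
An aside on the last of those targets: on one natural reading the inference itself does go through mechanically, so denying that C1 follows may not be the most promising line of attack. Below is a minimal sketch in Lean 4; the predicate names (IsProgram, HasSemantics, HasMind) and the collapsing of A1 and A3 into a single hypothesis are my own simplification, not Searle's wording.

    -- Hypothetical encoding of the syllogism; the names and the merging of
    -- A1 with A3 are my own simplification of Searle's premises.
    theorem C1 {System : Type}
        (IsProgram HasSemantics HasMind : System → Prop)
        -- A1 + A3: a pure program, being only syntactic, has no semantics
        -- in virtue of that alone
        (A1A3 : ∀ s, IsProgram s → ¬ HasSemantics s)
        -- A2: anything with a mind has semantic (mental) content
        (A2 : ∀ s, HasMind s → HasSemantics s) :
        -- C1: no pure program is, by itself, sufficient for a mind
        ∀ s, IsProgram s → ¬ HasMind s := by
      intro s hProg hMind
      exact A1A3 s hProg (A2 s hMind)

On that reading the action is all in the premises, A3 especially, which is where the Wernicke's aphasia point below comes in.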
> 
> He continues with more formal arguments to defend his philosophy of mind, what he calls biological naturalism, but if C1 doesn't hold then we needn't consider them. 
> 
> I came back to ExI after a long hiatus (I have 6000+ unread messages in my ExI mail folder) because I was struck by the fact that Wernicke's aphasia lends support to A3, normally considered the only controversial premise in his argument.
> 
> -gts
> 
>> Like most of us here, he subscribes to and
>> promotes a species of naturalism. He [Searle] adamantly
>> rejects both property and substance dualism. You won't
>> find any mystical hocus-pocus in his philosophy.
>> 
>> Bullshit. He thinks intelligent behavior is possible
>> without consciousness, so evolution could not have produced
>> consciousness, no way no how. He has no other explanation
>> of how it came to be, so to explain its existence he has no
>> choice but to resort to mystical hocus-pocus.
>> 
>> He allows for the possibility
>> of Strong Artificial Intelligence. He just doesn't think
>> it possible with formal programs running on hardware. Not
>> hardware enough!
>> 
>> So if atoms are arranged in a way that produces a
>> human brain, those atoms can produce consciousness, and if
>> arranged as a computer they can too, provided the computer
>> doesn't use hardware or software. Don't you find
>> that idea just a little bit stupid?
>>  John K Clark 
> 
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
