[ExI] The symbol grounding problem in strong AI
Gordon Swobe
gts_2000 at yahoo.com
Thu Dec 17 01:35:31 UTC 2009
--- On Wed, 12/16/09, John Clark <jonkc at bellsouth.net> wrote:
>> I take a huge flying leap of faith and assume that John
>> Clark's brain can think too.
> The problem is that you're willing to make that
> huge leap of faith for me but not for a computer; you'll
> do it for meat but not for silicon.
I would like to do it for my computer, John, but first you will need to show me a flaw in Searle's formal argument.
I offer it yet again in answer to your words above:
Because
1) Programs are formal (syntactic)
and because
2) Minds have mental contents (semantics)
and because
3) Syntax is neither constitutive of nor sufficient for semantics
It follows that
4) Programs are neither constitutive of nor sufficient for minds.
If you think you see a logical problem, then show it to me.
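For clarity, here is one way to spell out the logical form, as a minimal sketch in Lean 4. The predicate names are my own glosses, and I read premise 3 strongly, as saying that nothing purely syntactic thereby has semantics:

    theorem searle {System : Type}
        (program mind syntactic semantic : System → Prop)
        (p1 : ∀ s, program s → syntactic s)      -- premise 1: programs are formal (syntactic)
        (p2 : ∀ s, mind s → semantic s)          -- premise 2: minds have semantic contents
        (p3 : ∀ s, syntactic s → ¬ semantic s)   -- premise 3, strong reading: syntax never suffices for semantics
        : ∀ s, program s → ¬ mind s :=           -- conclusion 4: no program suffices for a mind
      fun s hp hm => p3 s (p1 s hp) (p2 s hm)

Note that the strong reading of 3 is doing the work: if 3 merely denied that syntax entails semantics, the conclusion would not follow as a matter of logic. As I understand him, Searle's "neither constitutive of nor sufficient for" is meant in the strong sense.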
-gts