[ExI] The symbol grounding problem in strong AI

Jeff Davis jrd1415 at gmail.com
Mon Dec 21 20:02:08 UTC 2009


On Sun, Dec 20, 2009 at 5:44 PM, Gordon Swobe <gts_2000 at yahoo.com> wrote:

>... How can a program get semantics?

Earlier in this thread, Gordon enumerated the following points:

1) Programs are formal (syntactic).
2) Minds have mental contents (semantics).
3) Syntax is neither constitutive of nor sufficient for semantics.
4) Programs are neither constitutive of nor sufficient for minds.

and then challenged interested parties thusly: "If you think you see a
logical problem then show it to me."

Let me give it a shot, with one caveat: Searle may have had a good
deal more to say beyond the above four points, and since I am not
well versed in that additional context, there's ample room for error
on my part.

First let me restate the four points somewhat more bluntly:

1) Programs are about rules, not meanings.
2) Minds have contents that have meanings.
3) Rules won't generate meanings.
4) So programs can't generate minds.
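
Spelling out the skeleton may help locate where a challenge has to
land.  This is my own rough formalization, not Searle's notation --
the predicates Prog, Syn, Sem, and Yields are invented for
illustration, and step (4) leans on the implicit assumption that
producing a mind would mean producing semantics:

\[
\begin{array}{lll}
(1) & \forall p\,(\mathrm{Prog}(p) \rightarrow \mathrm{Syn}(p)) & \text{programs are purely syntactic} \\
(2) & \forall m\,(\mathrm{Mind}(m) \rightarrow \mathrm{Sem}(m)) & \text{minds have semantic contents} \\
(3) & \forall x\,(\mathrm{Syn}(x) \rightarrow \neg\,\mathrm{Yields}(x,\mathrm{Sem})) & \text{syntax cannot yield semantics} \\
(4) & \therefore\ \forall p\,(\mathrm{Prog}(p) \rightarrow \neg\,\mathrm{Yields}(p,\mathrm{Mind})) & \text{from (1)--(3)}
\end{array}
\]

Granting (1) and (2), everything hangs on premise (3) -- and that is
the premise I mean to attack.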

Okay.

Programs don't exist in a vacuum; they RUN on a suitable substrate.
That substrate carries information in both its structure and its
storage devices.  When a program runs, it generates output, which
modifies and increases that information content.  Eventually, the
inherent functionality of the program, combined with the initial
information (the time-zero data set) and ongoing external inputs,
generates the necessary semantics/meanings.
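
To make that concrete, here is a toy sketch in Python -- my own
illustration, not anything from Searle or Gordon, and the names
(Grounder, observe, meaning_of) are invented.  It shows only the
uncontroversial part: a purely rule-following program whose time-zero
state is empty and whose stored associations grow with external
inputs.  Whether such accumulated associations ever amount to genuine
semantics is, of course, exactly what is in dispute.

    class Grounder:
        def __init__(self):
            # time-zero data set: empty -- no meanings yet
            self.associations = {}

        def observe(self, symbol, sensory_context):
            # ongoing external input: record the contexts in which
            # a symbol occurs
            self.associations.setdefault(symbol, []).append(sensory_context)

        def meaning_of(self, symbol):
            # the "meaning" here is just the accumulated usage history
            return self.associations.get(symbol, [])

    g = Grounder()
    g.observe("hot", {"temperature_C": 41.0, "reflex": "withdraw"})
    g.observe("hot", {"temperature_C": 38.5, "reflex": "withdraw"})
    print(g.meaning_of("hot"))  # non-empty now; was empty at time zero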

I offer as an example the human sperm and egg.  They appear to lack
mind and meaning, but they have information, structure, and
functionality.  Combine the sperm and the egg, and they can produce a
human infant.  In regard to mind and meaning, that infant starts out
as a blank slate -- setting aside a presumably substantial inventory
of genetically-encoded meanings in the form of instinctive mental
behaviors.  That syntax-equipped-yet-semantically-blank proto-mind
will then proceed to absorb additional inputs and generate the full
complement of screwed-up semantics that we see in the typical mature
human.

Searle's argument seems little more than another attempt -- born of
ooga-booga spirituality -- to deny the basic truth of materialism:
Life, persona, mind, and consciousness are the entirely unspecial
result of the "bubble, bubble, toil, and trouble" of stardust in the
galactic cauldron.  When life, persona, mind, and consciousness are
eventually deconstructed, they will be seen to be as mundane as the
dirt from which they sprang.  Some may find this a bad thing.  I
consider it, already, immensely liberating.

Best, Jeff Davis

  "Everything's hard till you know how to do it."
                          Ray Charles


