[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Sun Dec 20 22:15:57 UTC 2009


--- On Sun, 12/20/09, Aware <aware at awareresearch.com> wrote:

> There is no essential consciousness to be explained, but there is the
> very real phenomenon of self-awareness, rife with gaps, distortions,
> delays and confabulation, displayed by many adapted organisms,
> conferring obvious evolutionary advantages in terms of the agent
> modeling its /self/ within its environment of interaction.  

More to the point, we have this phenomenon to which I referred in the title of the thread: symbol grounding.

Frankly, for all I care, consciousness may not exist. But symbol grounding does seem to happen by some means. The notion of consciousness seems to help explain it, but that hardly matters. If we cannot duplicate symbol grounding in programs, then it seems we cannot have strong AI in S/H systems.


-gts

