[ExI] The symbol grounding problem in strong AI.

Gordon Swobe gts_2000 at yahoo.com
Sun Jan 3 17:39:33 UTC 2010


--- On Sat, 1/2/10, John Clark <jonkc at bellsouth.net> wrote:

>> If you think I have an issue with Darwin then either you don't 
>> understand me or you don't understand Darwin. 

> I'm sure emotionally you side with Darwin, but
> you haven't pondered his ideas in any depth because if
> you had you'd know that the idea that consciousness and
> intelligence can be separated and are caused by processes
> that have nothing to do with each other is 100%
> contradictory to Darwin's insight. 

You misunderstand me if you think I believe consciousness and intelligence exist "separately" in humans or other animals. For most purposes we can treat them as near-synonyms, or at least as handmaidens.

The distinction does, however, become important in the context of strong AI research. Symbol grounding requires the sort of subjective first-person perspective that evolved in these machines we call humans, and that probably also evolved in other species. If we can duplicate it in software/hardware systems, then those systems can have strong AI. Not really a complicated idea.

-gts
