[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Sun Dec 27 00:40:41 UTC 2009


--- On Sat, 12/26/09, Damien Broderick <thespike at satx.rr.com> wrote:

> Who seriously attributes mind (or consciousness or
> intentionality) to their laptop, or even the world's best
> existing supercomputer? 

Some people do, but thanks for bringing the subject back to strong vs. weak AI.

>> If computers have minds then so do kitchen can-openers
>> and the word loses all meaning.
> 
> That's as ridiculous, in the context of considering a
> hypothetical human-grade AGI machine, as saying "If human
> brains have minds then so do bedbugs and individual
> neurons."

Just by way of clarification: AGI does not require intentionality, at least not as I use the term. Strong AI of the sort that Searle refutes does require it.

-gts
