[ExI] The symbol grounding problem in strong AI
Samantha Atkins
sjatkins at mac.com
Tue Dec 22 21:54:59 UTC 2009
On Dec 18, 2009, at 3:09 AM, Ben Zaiboc wrote:
> Gordon Swobe <gts_2000 at yahoo.com> wrote:
>
>> --- On Thu, 12/17/09, Ben Zaiboc <bbenzai at yahoo.com>
>> wrote:
> ...
>
>>> It's "Yes", "Yes", and "What symbol grounding
>>> problem?"
>>
>> You'll understand the symbol grounding problem if and when
>> you understand my last sentence, that I did nothing more
>> interesting than does a cartoonist.
>
> LOL. I didn't mean that I don't understand what the 'symbol grounding problem' is; I meant that there is no such problem. This seems to be a pretty fundamental sticking point, so I'll explain my thinking.
>
> We do not know what 'reality' is. There is nothing in our brains that can directly comprehend reality (if that even means anything). What we do is collect sensory data via our eyes, ears, etc., and sift it, sort it, combine it, distort it with preconceptions and past memories, and create 'sensory maps' which are then used to feed the more abstract parts of our minds, to create 'the World according to You'.
Your argument is wanting. What is our sensory experience of, if not reality? In what do our senses and mind exist, if not in reality? What would "direct comprehension" even be, some mystical meandering down fantasy lane? Please explain how any material (i.e., existing or possibly existing) being could apprehend reality *except* through some type of senses and a brain creating a map of what is "out there" from sense data. To condemn the only possible form of knowing reality as not knowing reality at all is a bizarre argument.
- samantha