[ExI] The symbol grounding problem in strong AI
gts_2000 at yahoo.com
Fri Jan 1 16:20:13 UTC 2010
--- On Fri, 1/1/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>> I'll put an experiment to you, and you tell me what
>> the answer should be:
>> "Please imagine that your brain exists as partly real
>> and partly as an abstract formal description of its former
>> reality, and then report your imagined subjective experience."
>> I hope you can appreciate how any reasonable person would
>> consider that question incoherent and even ludicrous. I hope
>> you can also see that from my point of view, you asked me
>> that same question.
> What does "partly as an abstract formal description of its
> former reality" mean?
It means that programs exist as formal descriptions of real or supposed objects or processes. They describe and simulate real objects and processes, but they do not equal them.
> I asked you no such thing.
You did, but apparently you didn't understand me well enough to realize it.
> I asked what would happen if a
> surgeon installed in your brain artificial neurons which were
> designed so that they perform the same function as biological neurons.
I have no problem with artificial neurons per se. I have a problem with the notion that programs that simulate real objects and processes, such as those in your plan for artificial neurons, can have the same sort of reality as the neurological objects and processes they simulate. They can't. You might just as well have asked me to imagine myself as imaginary, whatever that means.