[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Mon Dec 28 23:35:10 UTC 2009


--- On Mon, 12/28/09, Stathis Papaioannou <stathisp at gmail.com> wrote:

> Well, I think you've finally understood the problem. If
> indeed there is something in the physics of neurons that is not
> computable, then we won't be able to make artificial neurons based on
> computation that behave like biological neurons. 

It seems you still don't understand the position that you really want (or should want) to refute here. :) 

1) I do not believe anything about the physics of neurons makes them impossible to compute. In principle we can make exact blueprints of real neurons on a computer.

2) I believe we can in principle create neurons "based on" those computer blueprints, just as we can make anything from blueprints, and that those manufactured neurons will behave exactly like natural neurons.

3) I do *not* however believe that any neurons you might manufacture that contain computer simulations (i.e., formal programs) in place of the natural processes that correlate with consciousness will act like natural neurons. The reason for this is simple: computer simulations of things do not equal the things they simulate. They contain the forms of things but not the substance of things.

> But Searle claims that weak AI *is* possible.
> He even alludes to Church's thesis to support this:

Yes, and so you think Searle is hoist with his own petard! 

> However, Searle thinks that although the behaviour of the
> brain can be replicated by a computer, consciousness cannot. 

Consciousness can be replicated on a computer in much the same way as a cartoonist replicates it. 

The simplest kind of cartoonist puts a little cloud over his character's head and types words into it to "replicate" consciousness. A more advanced cartoonist adds a time dimension by drawing several frames. An even more advanced cartoonist makes a computer animation and adds audio to replace the clouds (or simply shows the character thinking). A yet more advanced cartoonist makes his cartoon into a 3-D hologram. At the most sophisticated level, the cartoonist creates a perfect computer model of a real brain and inserts that baby into his hologram, creating weak AI.

But no matter how sophisticated the cartoon gets, it remains just a cartoon. It never magically turns into the real McCoy.

-gts
