[ExI] Meaningless Symbols.
stathisp at gmail.com
Thu Jan 14 01:56:38 UTC 2010
2010/1/14 Gordon Swobe <gts_2000 at yahoo.com>:
> --- On Wed, 1/13/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>> Hence the point: the system understands even though the
>> parts of it don't. We already knew that was the case, so the CR
>> does not add anything to the discussion.
> Forget about the CR. Neither of us cares whether parts of the system understand anything. We want to know whether the system as a whole knows Chinese from manipulating Chinese symbols according to rules of syntax.
> It cannot, because syntax only tells the system 'what' to put 'where' and 'when'. The system looks at the forms of things, not at the meanings of things.
> Here's the classic one-line program:
> print "Hello World"
> It takes the form
> <syntactic rule><string>
> The system does not understand or care about the semantic drivel you put in the string. It just follows the syntactic rule (and it doesn't care about that either, by the way) and prints the contents of the string.
> Do you think the system understands the string? Do you think that upon running this program, a little conscious entity inside your computer will greet you? Seriously, Stathis, what do you think?
No, because this program is less complex than even a single neuron.
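The rulebook picture under discussion can be made concrete with a toy sketch (my own illustration, not anything from the thread): a program that answers Chinese input purely by looking up the shape of the symbols in a table, with no access to what any symbol means.

```python
# Toy "Chinese Room" sketch: the program manipulates symbols by form alone.
# The rulebook below is hypothetical; the mappings are arbitrary examples.
RULEBOOK = {
    "你好": "你好！",
    "你会说中文吗？": "会。",
}

def chinese_room(symbols: str) -> str:
    """Apply a purely syntactic rule: 'what' to put 'where' and 'when'.

    The lookup inspects only the shape of the input string, never its
    meaning; unknown shapes get a fixed fallback symbol.
    """
    return RULEBOOK.get(symbols, "？")

print(chinese_room("你好"))  # prints 你好！ without "understanding" either string
```

Whether scaling this rulebook up changes anything philosophically is, of course, exactly what is in dispute.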
> And by the way the most sophisticated program possible on a s/h system will differ in no philosophically important way from this one.
The problem is that you can't explain how humans get their
understanding. It doesn't help to say that some physical activity
happens in neurons which produces the understanding, not because you
haven't given the details of the physical activity, but because you
haven't explained how, in general terms, it is possible for the
physical activity in a brain to pull off that trick but not the
physical activity in a computer. Even if it's true that computers only
do syntax and that syntax can't produce meaning (it isn't true, since
logically there is nowhere else for meaning to come from), this does
not show that computers can't produce meaning. It would be like saying
that brains only do chemistry and chemistry can't produce meaning. In
the course of doing chemistry, brains manipulate symbols, and that is
where the meaning comes from if you believe meaning can only come from
symbol manipulation; and in the course of manipulating symbols,
computers are physically active, and that is where the meaning comes
from if you believe meaning can only come from physical activity.