[ExI] The digital nature of brains (was: digital simulations)

Stathis Papaioannou stathisp at gmail.com
Wed Jan 27 23:04:05 UTC 2010


On 28 January 2010 00:16, Gordon Swobe <gts_2000 at yahoo.com> wrote:

>>> When the program finishes, the system will have made
>>> every possible meaningful association of W to other words.
>>> Will it then have conscious understanding of the meaning of W?
>>> No. The human operator will understand W, but s/h systems
>>> have no means of attaching meanings to symbols. The system
>>> followed purely syntactic rules to make all those hundreds
>>> of millions of associations without ever understanding them.
>>> It cannot get semantics from syntax.
>
> You replied:
>
>> But if you put a human in place of the computer doing the
>> same thing he won't understand the symbols either, no matter how
>> intelligent he is.
>
> Absolutely right!!
>
> My example above works like the Chinese room, but with the languages reversed. We can imagine a Chinese man operating the word-association program inside the Cray computer. He will manipulate the English symbols according to the syntactic rules specified by the program, and he will do so in ways that appear meaningful to an English-speaking human operator, but he will never come to understand the English symbols. He cannot get semantics from syntax.
>
> This represents progress, because you argued just a few days ago that perhaps people and also computers really do get semantics from syntax. Now I think you see that they do not.
>
> It looks like you agree with the third premise:
>
> 'A3: Syntax is neither constitutive of nor sufficient for semantics.'
>
> We might also add this corollary:
>
> 'A3a: The mere syntactic association of symbols is not sufficient for semantics'
>
> These truths are pretty easy to see, and now you see them.

I'm afraid I don't agree. The man in the room doesn't understand the
symbols, the matter in the computer doesn't understand the symbols,
but the process of computing *does* understand the symbols.

Look at it this way: you understand the symbols, but you can't see how
the understanding comes from syntax. You think it's impossible. But it
looks even more impossible that the understanding should come from
matter. I could make the statement "matter is neither constitutive of
nor sufficient for semantics". You don't have any answer to that other
than to point to a brain and say it has understanding. But I can point
to a brain and say that it has understanding by virtue of the
information processing it does. From your point of view a miracle has
to occur in either case, but at least with the computational
explanation we are in the same ballpark, since symbols, semantics and
syntax all have to do with information, while matter and meaning are
utterly different things. And if you still stubbornly insist that it's
the matter that has the understanding, then you can say that it's the
matter in the computer that is responsible.


-- 
Stathis Papaioannou
