[ExI] The digital nature of brains (was: digital simulations)

Gordon Swobe gts_2000 at yahoo.com
Thu Jan 28 19:05:00 UTC 2010


--- On Wed, 1/27/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
 
>> When the program finishes, the system will
>> have made every possible meaningful association of W to
>> other words. Will it then have conscious understanding of the
>> meaning of W? No. The human operator will understand W but
>> s/h systems have no means of attaching meanings to
>> symbols. The system followed purely syntactic rules to make all
>> those hundreds of millions of associations without ever
>> understanding them. It cannot get semantics from syntax.

I'll call the dictionary-word-association s/h (software/hardware) program described above "DWAP".
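To make the point concrete, here is a minimal sketch of what a DWAP-like program might look like (the dictionary contents and all names here are hypothetical, invented only for illustration): it links each headword to the other headwords appearing in its definition, by pure symbol manipulation.

```python
# Hypothetical DWAP-style association builder. It shuffles symbols
# according to a syntactic rule ("link a word to the dictionary words
# in its definition") without attaching meaning to any of them.

toy_dictionary = {
    "water": "clear liquid that falls as rain",
    "rain": "water that falls from clouds",
    "liquid": "matter that flows like water",
}

def associate(dictionary):
    """Map each headword to the other headwords found in its definition."""
    associations = {}
    for word, definition in dictionary.items():
        tokens = definition.split()
        associations[word] = sorted(
            t for t in tokens if t in dictionary and t != word
        )
    return associations

print(associate(toy_dictionary))
# -> {'water': ['liquid', 'rain'], 'rain': ['water'], 'liquid': ['water']}
```

Whatever one thinks of the philosophical question, every step above is a formal operation on uninterpreted tokens; that is the sense in which the program is "purely syntactic".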

> I'm afraid I don't agree. The man in the room doesn't
> understand the symbols, the matter in the computer doesn't understand 
> the symbols, but the process of computing *does* understand the
> symbols.

You lost me there. Either DWAP has conscious understanding of W (in which case it 'has semantics'), or else DWAP does not have conscious understanding of W.

First you agreed with me that DWAP does not have semantics, and you also made the excellent observation that a human who performed the same syntactic operations on English symbols would also not obtain conscious understanding of the symbols merely by virtue of having performed those operations. It would take something else, you said. 

But now you seem to have reneged: you want to say that DWAP has semantics after all? I think you had it right the first time.

So let me ask you again in clear terms: 

Does DWAP have conscious understanding of W? Or not? 

And would a human non-English-speaker obtain conscious understanding of W from performing the same syntactic operations as did DWAP? Or not?

-gts


More information about the extropy-chat mailing list