[ExI] The digital nature of brains (was: digital simulations)
jonkc at bellsouth.net
Tue Jan 26 16:50:53 UTC 2010
Since my last post Gordon Swobe has written 4 more.
> We, but not computers, have the ability to hold the meanings of symbols in our minds as intentional objects, and to process those meanings consciously as you do at this very moment.
>> You can't explain what this meaning is
> I just did.
What you just said was that the meaning of meaning is the ability to hold meanings in our minds. And round and round we go.
> If I met an entity on the street that passed the TT, I would not know if that entity had semantics. However if I also knew that entity ran only formal programs then I would know from analytic arguments that it did not.
Your "analysis" goes like this: I, Gordon Swobe, fail to see how intelligence can produce consciousness, and the only possible explanation for my failure is that Darwin was wrong and intelligence cannot produce consciousness. I mean, I'm Gordon Swobe; what other explanation for my failure to see a connection could there possibly be?
> You agree that your software/hardware system does not have conscious understanding of symbols (semantics) but you also argue that digital computers can have it. Let me ask you: what would it take for your desktop computer to acquire this capacity that you insist it could have but does not have?
> [long tedious thought experiment] .... Will it then have conscious understanding of the meaning of W? No. The human operator will understand W, but s/h systems have no means of attaching meanings to symbols. The system followed purely syntactic rules to make all those hundreds of millions of associations without ever understanding them. It cannot get semantics from syntax.
As is your custom in thought experiments, you simply declare what you are trying to prove. I really don't understand why you don't just keep the declarations and skip the thought experiment; it would save a lot of time.
John K Clark