[ExI] How not to make a thought experiment
jonkc at bellsouth.net
Mon Feb 1 21:59:53 UTC 2010
On Jan 31, 2010, Gordon Swobe wrote:
> digital models of human brains will have the real properties of natural brains if and only if natural brains already exist as digital objects
You've said that before, and when you did I said that brains are not important, minds are; and minds are digital although they are not objects. To save time and avoid needless wear and tear on electrons, the next time you have the urge to repeat that same remark yet again let's adopt the convention of you just saying "41", and my retort will be "42".
> Philosophers of mind don't care much about how "useful" it may seem.
And that's why philosophers of mind have never produced anything useful and probably never will; computer programmers have, mathematicians have, but philosophers of mind not so much.
> They do care if it has a mind capable of having conscious intentional states:
Unfortunately that is all philosophers of mind care about. If they spent just a little time considering what the mind in question actually does, regardless of what "intentional state" it is in, they would be much more successful. If they spent time taking a high school biology class they would be even better off. But they dislike getting their hands dirty conducting experiments other than the thought kind, and considering actual evidence is even more disagreeable to them.
Darwin contributed astronomically more to understanding what the mind is than any philosopher of mind who ever lived, and these two-bit philosophers act as if they've never heard of him; they deserve our contempt.
> Stathis. Looks like you want to skirt the issue by asserting that the system understands things that the man, *considered as the system*, does not understand.
Some might think it was outrageous enough to propose a thought experiment that contained a room larger than the observable universe, one that operated so slowly that the 13.7-billion-year age of the universe is not nearly enough time for it to complete a single action, and then to confidently proclaim exactly what this bizarre amalgamation can and cannot understand; but no, Searle was just getting warmed up. Calling his next step ridiculous doesn't capture its true nature; it's more like ridiculous raised to the ridiculous power.
Piling absurdity on top of absurdity, he now wants us to think about a "man" who "internalized" this contraption that is far too large and far too slow to fit in our universe. I don't know what sort of entity could do that, and I would be a fool to claim to know what that vastly improbable something could and couldn't do; and so would you, and so would Searle. I do know one thing: whatever it is, you can bet your life it isn't a man.
> The system you describe won't really "know" it is red. It will merely act as if it knows it is red
Einstein didn't understand physics, he just acted like he understood physics. Tiger Woods didn't understand how to play golf, he just acted like he understood how to play golf. I've said it before and I'll say it again: understanding is useless!
John K Clark