[ExI] How not to make a thought experiment

John Clark jonkc at bellsouth.net
Sat Feb 27 16:43:11 UTC 2010


Since my last post Gordon Swobe has posted 8 times.

> Hard drives and newspapers contain information but they do not have mental states intentional or otherwise

I think he is probably right in this instance, but since Swobe holds that the way these things behave is irrelevant to the question of consciousness, one wonders how he knows this. I have asked similar questions before but have never received an answer. Swobe cultivates the impression that he is too noble to answer difficult questions about his illogical theory, but I think he doesn't answer because he has no answer.

> On my materialist view, brain matter causes and contains thoughts.

Brains cause thoughts, but I don't even know what it means to say they "contain thoughts," as it's meaningless to specify a physical location for a thought. I very much doubt Swobe knows what the phrase means either; the man just types stuff.

> the noun thought [...] brain matter has mass. So notwithstanding the possible involvement of massless particles, thoughts have mass.

I might have said something like that to parody Swobe's views, but he has saved me the trouble by providing the perfect self parody.

> If you hold that matter can have both physical and non-physical properties, and if you consider mental states (thoughts, beliefs, desires, and so on) as non-physical properties of matter, then you may consider yourself a property dualist.

Does Swobe consider big, small, green, swiftly, or the number eleven a non-physical property? I don't know, and I doubt if Swobe knows either; the man just types stuff.

> we cannot, in a given human, separate that human's consciousness from his behavior.

But Swobe thinks we can separate a computer's consciousness from its behavior. What is the fundamental reason Swobe thinks the two cases should be treated so very differently? I don't know, and I doubt if Swobe knows either; the man just types stuff.
> But it does not follow that weak AI = strong AI

It seems to me that it very much does follow; why does Swobe think differently? I don't know, and I doubt if Swobe knows either; the man just types stuff.

> I do claim that conscious thoughts arise as high-level features of brain matter, yes. The idea seems odd only because we don't yet understand the mechanism.

There is nothing odd about it; the matter is bloody obvious, but then, unlike Swobe, I happen to think behavior is important. So in the end we are left with the same question we started with: how does Swobe know these things? How does Swobe know consciousness is not a high-level feature of the big toe? I don't know, and I doubt if Swobe knows either; the man just types stuff.

I don't expect Swobe to answer this or any other serious question about his naive views; rather, he will continue with his usual debating style and simply ignore the questions he is afraid of.

 John K Clark 



