[ExI] How not to make a thought experiment

John Clark jonkc at bellsouth.net
Tue Feb 23 16:24:43 UTC 2010


Since my last post Gordon Swobe has posted 4 times.

> On computer B the simulated brain simulates the processing of simulated thoughts

So now we have a simulation of a simulation of a simulation. Aren't Swobe's thought experiments grand? He always makes things so crystal clear.

> but here you do object when I tell you the simulated brain doesn't really think. Here you want to tell me the simulated brain really does think real thoughts.

Simulated thoughts differ from real thoughts in exactly the same way simulated arithmetic differs from real arithmetic: not at all.

> What's a simulated thought, you ask? Here's a famous one that I've mentioned before:
> print "Hello World"
> It doesn't get any better than that

That's what I thought, and if that's the best example Swobe can come up with, then the concept is empty.

> it [a computer] should really think real conscious thoughts like you and me. 

If Swobe were consistent and played by his own rules he would ascribe consciousness only to himself, not to other people; but of course consistency is not his strong suit.

> More precisely, why do you classify "thoughts" in a different category than you do "blood" and "food"

What an incredibly stupid question! If thoughts do not belong in a separate category from blood or food, then what the hell is the point of having categories?

> you have adopted a dualistic world-view in which mental phenomena fall into a different category than do ordinary material entities

Well of course I have, as would any sane person.

> I can write a program today that will make a computer act conscious to a limited degree.

I'm sure he can, but Swobe would find it much more difficult to write a program that was intelligent even to a limited degree; he would find that consciousness is easy but intelligence is hard, just as Evolution did.

> With enough knowledge and resources I could write one that fooled you into thinking it had consciousness -- that caused a computer or robot to behave in such a way that it passed the Turing test.

Swobe is saying that if he knew how to make an AI, then he could make an AI if he had a lot of money. I could be wrong, but I believe there may be others who could make a similar claim.

> I don't understand why you should even question the separability of behavior and consciousness.

I'm not surprised Swobe doesn't understand; there is much in the natural world that is baffling if one is totally ignorant of Darwin's Theory of Evolution.

 John K Clark
