[ExI] How not to make a thought experiment

John Clark jonkc at bellsouth.net
Fri Feb 19 17:48:48 UTC 2010


Since my last post, Gordon Swobe has posted 9 times.

> The CRA thought experiment involves *you the reader* imagining *yourself* in the room (or as the room) using *your* mind to attempt to understand the Chinese symbols.

As is Swobe's habit, he is wrong yet again. The Chinese room experiment asks you to imagine yourself as a mechanical relay; that's it. However, Swobe is right about one thing: a relay is not conscious. Probably.

> Conscious awareness

As opposed to unconscious awareness.

> enhances intelligence and gives the organism more flexibility in dealing with multiple stimuli simultaneously.

So on Swobe's account consciousness enhances intelligence and changes behavior, and yet the Turing Test cannot detect even a whiff of it. Swobe does not see this idea as being world-class stupid. I do.

> As evidence of this we need only look at nature: conscious organisms like humans exhibit more complex and intelligent behaviors than do unconscious organisms like plants and microbes.

This is one of those very rare occasions when, incredible as it sounds, Swobe is actually correct. Another way to express the words quoted above is to say "The Turing Test works".

> you assume here as you do that the i/o behavior of a brain/neuron is all there is to the brain. [...]
> consciousness may involve the electrical signals that travel down the axons internal to the neurons

Swobe is always keen to tell us that nobody, including him, has any idea what causes consciousness; so it is equally likely that consciousness involves the size of one's foot, since after all, the only being Swobe knows with certainty to be conscious has one particular shoe size. I am not trying to be funny. It is easy to demonstrate that the brain and its neurons have something to do with intelligence, but if, as Swobe believes, that has nothing to do with consciousness, then the organ that is the seat of awareness is anybody's guess. The foot is as good a guess as any.

> Let us say that we created an artificial brain that contained a cubic foot of warm leftover mashed potatoes and gravy[...]  Would your mister potato-head have consciousness?

Swobe says he loves the Chinese room crapola because it can objectively determine what is conscious and what is not, and yet when he tries to defend this ridiculous idea he repeatedly dreams up intelligent things that are "obviously" not conscious, such as a computer made of toilet paper and now one made of mashed potatoes and gravy. But if all of this is so obvious, Swobe does not make it clear what in hell the point of the Chinese room thought experiment is.

And Swobe may be interested to know that his brain is in fact the product of last year's mashed potatoes and gravy; it's just a question of rearranging the atoms in a programmable way. DNA does exactly that.

> I think we can and will one day create unconscious robots that *act* like they have consciousness. 

Swobe thinks humans can make an environment that produces a being that acts like it's conscious, but that only God [or various euphemisms for that word] can create an environment that makes the real deal. I disagree.

> You should consider him [ the Chinese room dude] an actual man  [...]  I wanted to encourage you to consider the man as literally a man

Swobe says we should consider the Chinese room fellow literally a man: a man who can live for many trillions of years and "internalize" that book of instructions, an actual man who can memorize a document far larger than the observable universe. I say that remark is idiotic. Does anyone care to dispute my criticism?

> our man in the room has no understanding of any symbols and so no knowledge base to build on.

Wow, now I see the error of my ways! It's a pity Swobe didn't say that two months and several hundred posts ago; think of the time we could have saved. Oh wait, he did.

> He can do no more than follow the syntactic instructions in the program: if input = "squiggle" then output "squoogle". 

Wow, now I see the error of my ways! It's a pity Swobe didn't say that two months and several hundred posts ago; think of the time we could have saved. Oh wait, he did.
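
For what it's worth, the "syntactic instructions" Swobe keeps describing amount to nothing more than a table lookup. Here is a minimal sketch in Python, with the symbols and table entries invented purely for illustration:

    # A toy version of the rule-following Swobe describes: map each input
    # symbol to an output symbol by table lookup, with no representation of
    # meaning anywhere. The symbols and entries below are invented examples.
    RULE_BOOK = {
        "squiggle": "squoogle",
        "squoogle": "squiggle",
    }

    def room_reply(symbol: str) -> str:
        # Follow the rule book blindly; unknown symbols get a default reply.
        return RULE_BOOK.get(symbol, "blank")

    print(room_reply("squiggle"))  # prints "squoogle"

Nobody disputes that such a program can be written; the dispute is over the size of the rule book, which for real Chinese conversation would, as noted above, dwarf the observable universe.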

> syntactic order does not give understanding.

Wow, now I see the error of my ways! It's a pity Swobe didn't say that two months and several hundred posts ago; think of the time we could have saved. Oh wait, he did.

> formal syntax does not give semantics

Wow, now I see the error of my ways! It's a pity Swobe didn't say that two months and several hundred posts ago; think of the time we could have saved. Oh wait, he did.

 John K Clark
