[ExI] How not to make a thought experiment

Gordon Swobe gts_2000 at yahoo.com
Sun Feb 21 14:44:25 UTC 2010


A disturbed person who goes by the name John K Clark continues to post 
messages to this discussion thread in what appears to be an ongoing 
conversation with himself about me. 

He uses my name excessively in a bid to broadcast his confused 
thoughts to the search engines. His obsessive actions amount to a malicious attempt to slander me and to mischaracterize my views. 

I have no association with John K Clark.

Gordon Swobe 

top-posting in self-defense



> --- On Fri, 2/19/10, John Clark <jonkc at bellsouth.net>
> wrote:



> >> I think consciousness aids and enhances
> intelligence, something like the way a flashlight helps one
> move about in the dark.
> 
> I've said this many, many times before, but that doesn't
> prevent it from being true: despite believing the above, in
> a magnificent demonstration of doublethink, Swobe also
> believes that a behavioral demonstration like the Turing
> test cannot detect consciousness. 
> 
> > It seems probable to me that conscious intelligence
> involves less biological overhead than instinctive
> unconscious intelligence
> 
> Then the logical implication is crystal clear: it's harder
> to make an unconscious intelligence than a conscious
> intelligence. So if you encounter an intelligent machine,
> your default assumption should be that it is conscious. 
> 
> > especially when considering complex behaviors such as
> social behaviors. Perhaps nature selected it for that reason
> only.
> 
> So if Swobe met a robot with greater social intelligence
> than he has, would he consider it conscious? No, of course
> he would not, because, because... well, just because.
> 
> Actually that is what Swobe would say today, but I don't
> think that's what would really happen. If someone ever met
> up with such a machine, I think it would understand us so
> well, better than we understand ourselves, that it could
> convince anyone to believe in anything and could quite
> literally charm the pants off us. As Swobe points out, even
> today characters in video games seem conscious to some;
> a robot with a Jupiter Brain would convince even the
> most sophisticated among us. We would believe the robot was
> conscious even if we couldn't prove it. I have the same
> belief regarding Gordon Swobe, and the same lack of a proof.
> 
>  John K Clark
> 
> Since my last post Gordon Swobe has posted 9 times.
> 
> > The CRA thought experiment involves *you the reader*
> imagining *yourself* in the room (or as the room) using
> *your* mind to attempt to understand the Chinese symbols.
> 
> As is Swobe's habit, he is wrong yet again. The Chinese room
> experiment asks you to imagine yourself as a mechanical
> relay; that's it. However, Swobe is right about one thing: a
> relay is not conscious. Probably. 
> 
> > Conscious awareness
> 
> As opposed to unconscious awareness.
> 
> > enhances intelligence and gives the organism more
> flexibility in dealing with multiple stimuli
> simultaneously.
> 
> Consciousness enhances intelligence and changes behavior,
> but the Turing Test cannot detect even a whiff of it. Swobe
> does not see this idea as being world-class stupid. I do.
> 
> > As evidence of this we need only look at nature:
> conscious organisms like humans exhibit more complex and
> intelligent behaviors than do unconscious organisms like
> plants and microbes.
> 
> This is a very rare occasion where, incredible as it
> sounds, Swobe is actually correct. Another way to express
> Swobe's words quoted above is to say "The Turing Test
> works".
> 
> > you assume here, as you do, that the i/o behavior
> of a brain/neuron is all there is to the brain. [...]
> > consciousness may involve the electrical signals that
> travel down the axons internal to the neurons
> 
> Swobe is always keen to tell us that nobody, including him,
> has any idea what causes consciousness, so it is equally
> likely that consciousness involves the size of one's foot;
> after all, the only being Swobe knows with certainty to be
> conscious has one particular shoe size. I am not trying to
> be funny: it's easy to demonstrate that the brain and
> neurons have something to do with intelligence, but if, as
> Swobe believes, that has nothing to do with consciousness,
> then the organ that is the seat of awareness is anybody's
> guess. The foot is as good a guess as any other.
> 
> > Let us say that we created an artificial brain that
> contained a cubic foot of warm leftover mashed potatoes and
> gravy[...]  Would your mister potato-head have
> consciousness?
> 
> Swobe says he loves the Chinese room crapola because it can
> objectively determine what is conscious and what is not, and
> yet when he tries to defend this ridiculous idea he
> repeatedly dreams up intelligent things that are "obviously"
> not conscious, such as a computer made of toilet paper and
> now one made of mashed potatoes and gravy. But if all of
> this is obvious, Swobe does not make it clear what in hell
> the point of the Chinese room thought experiment is. 
> 
> And Swobe may be interested to know that his brain is in
> fact the product of last year's mashed potatoes and gravy;
> it's just a question of rearranging the atoms in a
> programmable way. DNA does exactly that.
> 
> > I think we can and will one day create unconscious
> robots that *act* like they have consciousness. 
> 
> Swobe thinks humans can make an environment that produces a
> being that acts like it's conscious, but that only God [or
> various euphemisms of that word] can create an environment
> that makes the real deal. I disagree.
> 
> > You should consider him [ the Chinese room dude] an
> actual man  [...]  I wanted to encourage you to
> consider the man as literally a man
> 
> Swobe says we should consider the Chinese room fellow as
> literally a man, a man who can live for many trillions of
> years and "internalize" that book of instructions, an actual
> man who can memorize a document far larger than the
> observable universe. I say that remark is idiotic. Does
> anyone care to dispute my criticism?
> 
> > our man in the room has no understanding of any
> symbols and so no knowledge base to build on.
> 
> Wow, now I see the error of my ways! It's a pity Swobe
> didn't say that two months and several hundred posts ago,
> think of the time we could have saved. Oh wait he did.
> 
> > He can do no more than follow the syntactic
> instructions in the program: if input = "squiggle" then
> output "squoogle". 
> 
> Wow, now I see the error of my ways! It's a pity Swobe
> didn't say that two months and several hundred posts ago,
> think of the time we could have saved. Oh wait he did.
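> 
> For concreteness, here is a minimal sketch of the purely
> syntactic program that quote describes (a toy Python lookup
> table; the symbol names and the default reply are my
> hypothetical additions, not Searle's actual rulebook, which
> is never spelled out):
> 
>     # A toy rulebook: symbol in, symbol out, with no
>     # reference to what (if anything) the symbols mean.
>     RULEBOOK = {
>         "squiggle": "squoogle",
>     }
> 
>     def room(symbol: str) -> str:
>         # The man in the room just looks the symbol up;
>         # "squaggle" is a hypothetical default for symbols
>         # the rulebook doesn't cover.
>         return RULEBOOK.get(symbol, "squaggle")
> 
>     print(room("squiggle"))  # -> squoogle
> 
> Nothing in that lookup depends on what the symbols mean,
> which is exactly the point: the program is all syntax.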
> 
> > syntactic order does not give understanding.
> 
> Wow, now I see the error of my ways! It's a pity Swobe
> didn't say that two months and several hundred posts ago,
> think of the time we could have saved. Oh wait he did.
> 
> > formal syntax does not give semantics
> 
> Wow, now I see the error of my ways! It's a pity Swobe
> didn't say that two months and several hundred posts ago,
> think of the time we could have saved. Oh wait he did.
> 
>  John K Clark
> Since my last post Gordon Swobe has posted 10 times.
> >
> > Come the singularity, some people will lose their
> grips on reality and find themselves believing such
> absurdities as that digital depictions of people have real
> mental states. A few lonely philosophers of my stripe will
> try in vain to restore their sanity.
> 
> As far as the future is concerned it really doesn't matter
> whether Swobe's ideas are right or wrong; either way they're
> as dead as the dodo. Even if he's 100% right and I am 100%
> wrong, people with my ideas will have vastly more influence
> than people like him, because we will not be held back by
> superstitious ideas about "THE ORIGINAL". So it's pedal to
> the metal upgrading, Jupiter brain ahead. Swobe just won't
> be able to keep up with the electronic competition.
> Only a few axons in the brain can send signals as fast as
> 100 meters per second; non-myelinated axons can manage only
> about 1 meter per second. Light moves at 300,000,000
> meters per second.
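> 
> To put those numbers in perspective, here is a quick
> back-of-the-envelope comparison (a Python sketch using only
> the speeds quoted above):
> 
>     # Signal speeds from the paragraph above, in meters/second.
>     MYELINATED_AXON = 100
>     UNMYELINATED_AXON = 1
>     LIGHT = 300_000_000
> 
>     # Light outruns the fastest axon by a factor of 3 million
>     # and the slowest by a factor of 300 million.
>     print(LIGHT / MYELINATED_AXON)    # -> 3000000.0
>     print(LIGHT / UNMYELINATED_AXON)  # -> 300000000.0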
> 
> Perhaps after the singularity the more conservative and
> superstitious among us could still survive in some little
> backwater somewhere, like the Amish do today, but I doubt
> it.
> 
> > I think you want me to believe that my watch has a
> small amount of consciousness by virtue of it having a small
> amount of intelligence. But I don't think that makes even a
> small amount of sense. It seems to me that my watch has no
> consciousness
> 
> I'm not surprised Swobe can't make sense of it all; nothing
> in the biological sciences makes any sense without
> evolution, and he has shown a profound ignorance not only of
> that theory but of the fossil record in general. Evolution
> found it far harder to come up with intelligence than
> consciousness: the brain structures that produce the basic
> emotions we share with many other animals are many
> hundreds of millions of years old, while the higher brain
> structures that produce language, mathematics and abstract
> thought in general, the things that make humans unique, are
> less than a million years old and possibly much less. Swobe
> does not use his higher brain structures to think with and
> prefers to think with his gut; but many animals have an
> intestinal tract and to my knowledge none of them are
> particularly good philosophers.
> 
> > Consciousness, as I mean it today, entails the ability
> to have conscious intentional states. That is, it entails
> the ability to have something consciously "in mind"
> 
> So consciousness means the ability to be conscious, that is
> to say the ability to consciously think about stuff. Thank
> you so much for those words of wisdom!
>   
> > If I make a jpeg of you with my digital camera, that
> digital depiction of you will have no mental states.
> 
> Swobe may very well be right in this particular instance,
> but it illustrates the useless nature of the grotesque
> exercises he gives the grandiose name "thought experiment".
> Swobe has no way to directly measure the mental states even
> of his fellow human beings, much less those of a digital
> camera; and yet over the last few months he has made grand
> pronouncements about the mental states of literally hundreds
> of things. To add insult to injury, the mental state of
> things is exactly what he's trying to prove; he just doesn't
> understand that saying X has no consciousness is not the
> same as proving X has no consciousness.
> 
> > The idea is that while a person doesn't understand
> Chinese, somehow the conjunction of that person and bits of
> paper might understand Chinese. It is not easy for me to
> imagine how someone who was not in the grip of an ideology
> would find the idea at all plausible. 
> 
> Swobe admits, and in fact seems delighted by the fact, that
> he has absolutely no idea what causes consciousness;
> nevertheless he thinks he can always determine a priori what
> has consciousness and what does not, and that it has nothing
> to do with the way things behave. The conjunction of a person
> with bits of paper might display intelligence, in fact there
> is no doubt that it could, but it could never be conscious
> because, because... well, just because; yet Swobe thinks 3
> pounds of grey goo being conscious is perfectly
> logical. Can Swobe explain why one thing is ridiculous and
> the other logical? Nope, it's just that he's accustomed to
> one and not the other. That's it. 
> 
> > Depictions of things, digital or otherwise, do not
> equal the things they depict
> 
> Wow, now I see the error of my ways! It's a pity Swobe
> didn't say that two months and several hundred posts ago,
> think of the time we could have saved. Oh wait he did.
> 
> > the man cannot grok the symbols by virtue of
> manipulating them according to the rules of syntax 
> 
> Wow, now I see the error of my ways! It's a pity Swobe
> didn't say that two months and several hundred posts ago,
> think of the time we could have saved. Oh wait he did.
> 
> > Depictions of things do not equal the things they
> depict.
> 
> Wow, now I see the error of my ways! It's a pity Swobe
> didn't say that two months and several hundred posts ago,
> think of the time we could have saved. Oh wait he did.
> 
> Gordon Swobe
> Gordon Swobe
> Gordon Swobe
> Gordon Swobe
> Gordon Swobe
> Gordon Swobe
