[ExI] Is the brain a digital computer?

Spencer Campbell lacertilian at gmail.com
Fri Feb 26 23:59:41 UTC 2010


Yeah, this is going to be a long one.

Gordon Swobe <gts_2000 at yahoo.com>:
> Spencer Campbell <lacertilian at gmail.com>:
>> This is meaningless and misleading. Thoughts aren't even real things.
>
> Sorry to hear then that you can't really think.
>
> I can really think real thoughts. <- I even typed that one

Well of course I can really think, but I don't really have real*
thoughts. I imagine I have thoughts. Everyone does. My thoughts are
imaginary. In a world where thoughts were real, we would be able to
build devices capable of extracting thoughts mechanically.

Here and now, we can only extract thoughts from people informatically.
As you have just successfully done with me.

* I am using my idiosyncratic definitions again here. Thoughts exist,
but are not real. I have not yet figured out a more sensible
terminology for the distinction I have in mind, but it's the same
thing that causes Searle to call computations observer-dependent or
whatever term he used.


Gordon Swobe <gts_2000 at yahoo.com>:
> Material perturbations in brain matter, I would say. If you really alter or eliminate the real brain matter, you will really alter or eliminate the real thoughts. So it looks like thoughts really exist as real material events in the real brain.

"Material event" almost seems like a contradiction in terms, but okay.
I will play this game.

No, if you alter or eliminate brain matter, real or imaginary, you
will NOT alter or eliminate thoughts, real or imaginary. This would
only be the case if thoughts were stored in the brain (or the mind, if
you prefer). They are not. Thoughts aren't *stored* anywhere; they
just unpredictably pop in and out of existence in a vaguely defined
space. Thoughts are events, not objects; even if you take the view
that they exist as discrete things to begin with, in the manner of a
quasiparticle, you are still forced to conclude that they are
ever-changing.

So it's nonsense to talk about "altering" thoughts. They alter
themselves a whole lot faster than you could ever hope to through
crude brain surgery.

I could just as easily make the case that it's also nonsense to talk
about "eliminating" a discrete thought, but that's fuzzier territory;
obviously we can eliminate a whole lot of thoughts at once, using
nothing more sophisticated than a crowbar and a swift swinging motion.
But that is really more like preventing new thoughts from occurring.
The thoughts present at the moment of impact would have been long gone
anyway by the time consciousness fades.


Chris Luebcke <cluebcke at yahoo.com>:
> I believe that the only workable definition of the noun 'a thought' is, fundamentally, 'a statement', which is certainly information, certainly can be held in media other than brains, and certainly does not have mass.

I was using Gordon's definition here, namely:

"at time t when the brain exists in conscious state s, the conscious
thought at t exists in/as a particular configuration of brain matter"

I like yours a lot better. It implies that thoughts can be recorded
quite easily, which says something very interesting about the
following two thoughts:

"I can really think real thoughts. <- I even typed that one"

and:

"I do not consider information on a hard drive as constitutive of
thought, for the same reason that I do not believe newspapers think
about the words printed on them."

So an email message can contain thoughts, but newspapers can't. And
hard drives can't either, in spite of the fact that they can contain
emails (which, of course, can).

This does not make sense. Gordon's formulation equates thinking with
"having thoughts", but the word "have" is surprisingly ambiguous. It
may mean to possess (I have a computer), to contain (my computer has
files), or even simply to be attached to (I have arms, my computer
has a keyboard).

Gordon considers thoughts to be configurations of brain matter*,
which implies that we can, in principle, literally print thoughts
into the brain, just as we can with a hard drive or a newspaper. Now,
I imagine that when he reads this he will instantly decide to accuse
me of some kind of homunculus fallacy, since the words on a newspaper
only mean something when we interpret them to. A thought printed on
neurons generates semantics from within! How? By interacting with the
rest of the brain, of course.

Yet, thoughts printed on a hard drive can surely interact with the
rest of the hard drive. It isn't as obvious that this is so, and in
fact it is not necessarily always so. I'm assuming a certain sort of
program is being run on the computer in question. Probably the easiest
to imagine would simply turn the whole hard drive into one big
two-dimensional cellular automaton, so that it is clear that every
piece of information stored on it is constantly interacting (at least
indirectly) with every other.
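
If it helps to make that concrete, here is a minimal sketch of the
kind of thing I'm imagining. The grid size and the update rule (I've
used Conway's Life, only because it's familiar) are my own arbitrary
choices; any rule that makes each cell's next state depend on its
neighbors would do the job.

import random

SIZE = 32  # a tiny "drive" of SIZE x SIZE one-bit cells

def random_grid(size=SIZE):
    return [[random.randint(0, 1) for _ in range(size)]
            for _ in range(size)]

def step(grid):
    size = len(grid)
    new = [[0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            # Count live neighbors, wrapping around the edges.
            n = sum(grid[(y + dy) % size][(x + dx) % size]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
            new[y][x] = 1 if n == 3 or (grid[y][x] and n == 2) else 0
    return new

grid = random_grid()
for _ in range(10):
    grid = step(grid)  # every bit keeps influencing every other bit,
                       # at least indirectly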

*Incidentally, in addition to supporting an internally inconsistent
view, this claim is, I'm pretty certain, just flat-out wrong in and
of itself. Thoughts aren't physical, local structures that you can
excise at will. I see now why Gordon thinks they have mass; he sees
them as literal clumps of grey goo, tiny local substructures in the
greater system. That actually makes good intuitive sense, and it's
pretty close to how things work in a hard drive, but it has little in
common with observable reality. Punky, from the Shufflebrain article
linked below, should make this pretty obvious. Consider "I must eat
worms" to be a thought, and you'll see what I mean.

http://www.indiana.edu/~pietsch/shufflebrain.html

(Compliments of Jeff Davis, Extropy-Chat, circa February 13th)


Gordon Swobe <gts_2000 at yahoo.com>:
> The Turing test defines (weak) AI and neurons cannot take the Turing test,
> so I don't know what it means to speak of an AI behaving like a neuron.

Blue Brain.

http://en.wikipedia.org/wiki/Blue_Brain_Project

If you prefer, you could just put a digital computer inside a robotic
neuron. I think this is more like what Stathis is hinting at, and it's
why I asked you once upon a time whether or not you considered the
Internet itself to be one large digital computer.

The logic goes like this: two computers, networked together,
constitute a single computer. This means that any number of networked
computers can be considered one computer.

Say you built a brain out of synthetic computerized neurons (SCNs)
whose apparent structure matched an ordinary brain but whose
low-level behavior was dictated programmatically (that is, each
neuron operates according to a copy of the same program). If those
SCNs are computers, in spite of the fact that they have little
artificial axons and such, then the whole brain is a computer.
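
To pin the idea down, here is a toy sketch of the sort of thing I
mean. Everything in it (the threshold rule, the wiring, the names) is
invented for illustration; a real SCN would run a far more detailed
program, and nothing here is meant to match Blue Brain or an actual
NGN.

import random

class SyntheticComputerizedNeuron:
    """One SCN: a little digital computer running the same program
    as every other SCN. The 'program' here is a crude threshold rule."""

    def __init__(self, threshold=2):
        self.threshold = threshold
        self.inputs = []      # upstream SCNs this one listens to
        self.firing = False

    def connect(self, upstream):
        self.inputs.append(upstream)

    def compute_next(self):
        # Fire on the next tick if enough upstream SCNs fire now.
        return sum(1 for n in self.inputs if n.firing) >= self.threshold

class SCNBrain:
    """A network of SCNs stepped in lockstep. If each SCN is a
    computer, the networked whole is arguably one (large) computer."""

    def __init__(self, count=100, fan_in=5):
        self.neurons = [SyntheticComputerizedNeuron()
                        for _ in range(count)]
        for n in self.neurons:
            for upstream in random.sample(self.neurons, fan_in):
                n.connect(upstream)

    def step(self):
        # Synchronous update: compute all next states, then commit.
        next_states = [n.compute_next() for n in self.neurons]
        for n, state in zip(self.neurons, next_states):
            n.firing = state

brain = SCNBrain()
for n in random.sample(brain.neurons, 10):
    n.firing = True           # seed some activity
for _ in range(20):
    brain.step()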

So, Gordon, I have a humble request. A plea, really. I actually do
feel pangs of guilt whenever I single you out like this, but it's
pretty much unavoidable.

Please visualize, as precisely as you possibly can, just what an SCN
is. Don't just apply labels to it, calling it this or that, deciding
on the fly what properties it has. No. Think about what it would
literally look like, what would be involved in its construction and
its programming.

Only then, after you've gotten a firm grasp on the concrete technology
involved, would I like to hear whether you believe (a) that an SCN is
a digital computer or (b) that an SCN is theoretically capable of
perfectly replicating the external behavior of a naturally-grown
neuron (NGN).

There. Maybe *this* repackaging of the same old idea will get through!




Christopher Luebcke <cluebcke at yahoo.com>:
> Whether it's
>   T is caused by B
>   B has the property m
> or
>   T is contained in B
>   B has the property m
> You cannot, from either of these cases, deduce that
>   T has the property m
> You could only deduce that statement if you modified the original statements such that
>   T is B

You beat me to it, and did a better job than I would have to boot!

Magnificent.
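
To spell the shape of the inference out in one line (the notation is
mine, not Christopher's):

\[
\mathrm{Causes}(B,T) \wedge m(B) \;\not\vdash\; m(T), \qquad
\mathrm{Contains}(B,T) \wedge m(B) \;\not\vdash\; m(T), \qquad
(T = B) \wedge m(B) \;\vdash\; m(T)
\]

Neither causation nor containment licenses carrying the property m
over to T; only identity does, by substitution.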

Sadly, it does not look like the kind of thing Gordon responds to.
Indeed, he merely cherry-picked the very last sentence, and twisted
the meaning rather egregiously in the process:

Christopher: "Surely you're not claiming that thoughts are brain matter?"

Gordon: "I do claim that conscious thoughts arise as high-level
features of brain matter, yes."

The breakdown in communication is, I think, self-evident. What to do about it?


