[ExI] digital simulations, descriptions and copies

Stathis Papaioannou stathisp at gmail.com
Thu Jan 21 22:41:37 UTC 2010

2010/1/22 Gordon Swobe <gts_2000 at yahoo.com>:
> --- On Thu, 1/21/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>> But finding a physical correlate does not provide an
>> "explanation". I can stubbornly point out that there is no logical
>> pathway from a lump of matter to meaning, even if there is an apparent
>> correlation.
> To say "there is no logical pathway from a lump of matter to meaning" is equivalent to saying that mind and matter exist in separate realms. It seems then that you really do want to espouse the mind/matter dualism handed down to us from Descartes.

I'm saying this to show where your assertion that syntax can't produce
meaning leads.

>> This is at least as convincing as your assertion that syntax
>> can't produce meaning.
> That's just a strictly logical argument. You don't like it, but it remains true nevertheless that the man in the Chinese room cannot understand the meanings of the symbols merely by manipulating them according to syntactic rules, the way computers actually do. At least, neither you nor anyone else has shown how that miracle could happen.

It also remains strictly true that a lump of matter cannot produce
meaning. Put a whole *mountain* of matter in a room and talk to it in
Chinese for a million years. Will it understand Chinese? No, it won't!
So how can organising the matter in a special way, whether in a brain
or in a computer, produce meaning when the meaning just isn't there to
begin with?

>> All you can do then is point to the brain and say, but there is the
>> proof, it thinks, you just have to accept it as a raw fact.
> Yes. That's all I can do.
>> So why can't someone point to a computer and say the same thing
> People can point all they want, but they need to explain how a program and its hardware can get semantics from the syntactic rules programmed into the machine by the programmer. We don't know exactly how the natural brain does it, but it sure looks like it cannot do it *that* way.

As I and others have said numerous times, it's quite obvious that
meaning could *only* come from the association of one symbol or input
with another. But in case you still don't accept that, and if you are
not bothered by saying that dumb matter acquires understanding even
though on the face of it that seems impossible, you can still say that
computers have understanding by virtue of the matter they contain
rather than by virtue of the programs they run.

Stathis Papaioannou
