[ExI] digital simulations, descriptions and copies

Eric Messick eric at m056832107.syzygy.com
Sun Jan 24 05:52:27 UTC 2010


Gordon writes:
>--- On Sat, 1/23/10, Eric Messick <eric at m056832107.syzygy.com> wrote:
>> Hey, no need to disparage my remarks with the label "religious"!
>
>Sure looks like religion to me!

Care to explain how?

I'm not seeing any deity.  I'm not seeing any ritual, any
superstition, any of the usual things associated with religion.  All
I'm seeing is a different axiomatic choice.  Would you say that
Euclidean versus non-Euclidean geometry is a religious issue?

>That "neural firing pattern" amounts to mindless software running on
> some computer.

Yes, it's software running on a computer.  The whole question here is
whether or not there is a mind, so assuming there isn't one isn't
going to help answer that question.

Non-Euclidean geometry is actually pretty useful and interesting, so
dropping the parallel postulate yields some interesting results.
You've got this axiom that minds cannot be implemented in software.
Does assuming that actually lead to any interesting or useful results?

Actually, as I've pointed out before, I'm not quite sure what your
assumption is.  You've never defined your terms clearly enough to make
a coherent statement of your axiom.

Is it:
  Syntax can never produce semantics.
or:
  Software can never be part of a mind.
or:
  Mind can never be simulated.
or:
  Consciousness is not a computational process.
?

All of these statements are similar.  You're asserting an absolute
disjunction between the sets: (software, syntax, simulation,
computation) and (semantics, mind, consciousness, meaning,
understanding).  While the first set is relatively well defined, the
second set is quite slippery.  You've got this idea that the second
set is this really special thing, but all you can say about it
is that it can't be built out of any of those other things.


>> If no one has actually enjoyed an apple, who wrote the
>> email?
>
>A program wrote it, one like Eliza (if you remember her) but perhaps
> smart enough to fool you.

A program was running, but nothing even remotely like that email was
designed into it (unlike Eliza).

The production of that email was an emergent behavior of 100 billion
copies of the same relatively simple program all interacting with each
other.  Why are you so eager to put limits on what 100 billion simple
entities can do when each is connected to thousands of others?
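To make the point concrete, here is a toy sketch in Python (my own
illustration, not a model of real neurons: the network size, wiring,
and threshold rule are all made up for demonstration).  Every unit
runs the identical trivial rule, yet the population-level activity
pattern is not written into any single unit:

```python
import random

random.seed(42)

N = 1000   # toy network: far smaller than the brain's ~100 billion neurons
K = 20     # random inputs per unit (real neurons average thousands)

# Wire each unit to K randomly chosen others, and start in a random state.
inputs = [random.sample(range(N), K) for _ in range(N)]
state = [random.random() < 0.5 for _ in range(N)]

def step(state):
    # The same simple rule for every unit:
    # fire if more than half of your inputs fired last step.
    return [sum(state[j] for j in inputs[i]) > K // 2 for i in range(N)]

# Run the network; the global activity pattern that emerges was never
# "designed into" any individual unit's two-line rule.
for _ in range(10):
    state = step(state)

print(sum(state), "of", N, "units active")
```

Nothing in the per-unit rule mentions the population-wide pattern that
results; scale the same idea up by eight orders of magnitude in units
and connections and the gap between the rule and the behavior only
widens.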

Stomping your feet and insisting that it just *can't* be that way
doesn't keep a system with a quadrillion connections from
spontaneously creating a behavior that you can't understand.

-eric
