[ExI] multiple realizability

Spencer Campbell lacertilian at gmail.com
Mon Feb 1 02:47:55 UTC 2010


Gordon Swobe <gts_2000 at yahoo.com>:
> The layer of abstraction does not matter to me.

Bad move. You've been attacked before on the basis that you have
trouble comprehending the importance of abstraction. How far down
through the layers one can go before new complexities cease to emerge
is a tremendous component of the argument against formal programs
being capable of creating consciousness.

To prove this is trivial. All I have to do is invoke a couple of black boxes:

One box contains my brain, and another box contains a standard digital
computer running the best possible simulation of my brain. Both brains
begin in exactly the same state. A single question is sent into each
box at the same moment, and the responses coming out of the other side
are identical.

This is the highest level of abstraction, turning whole brains into,
essentially, pseudo-random number generators. They carry states; an
input is combined with the state through a definite function; the
state changes; an output is produced.
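
If it helps, here is the same picture as a toy Python sketch. The
BlackBox class and its transition function are my own stand-ins for
whatever machinery is actually inside each box; nothing here is meant
as a model of a brain, only of the state-input-output scheme above.

    class BlackBox:
        def __init__(self, state):
            self.state = state          # the box carries a state

        def ask(self, question):
            # The input is combined with the state through a definite function.
            combined = (self.state * 31 + len(question)) % (2 ** 32)
            self.state = combined       # the state changes
            return combined % 1000      # an output is produced

    # Two boxes started in exactly the same state return identical
    # responses to the same question, whatever is hidden inside them.
    brain = BlackBox(42)
    simulation = BlackBox(42)
    assert brain.ask("Am I conscious?") == simulation.ask("Am I conscious?")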

Gordon has said before that in situations like these, it is impossible
to determine whether or not either box has consciousness without
"exploratory surgery". I assume Gordon is at least as good a surgeon
as Almighty God, and has unlimited computing power available to
analyze the resulting data instantaneously.

The point is that such surgery is precisely the sort of process which
reduces the level of abstraction. A crash course may be in order.

You are given ten thousand people. You ask, "How many have blue
eyes?". The number splits into two, becoming less abstract. You ask,
"How many are taller than I am?". Now there are four numbers, and one
quarter the abstraction. Eventually any question you ask will be
redundant, as you will have split the population into ten thousand
groups of one. But there is still some abstraction left: people are
not fundamental particles. So you ask enough questions to uniquely
identify every proton, neutron, electron, and any other relevant
component. Yet still your description is abstract, because you've
only differentiated the particles: you haven't determined their exact
locations in space.
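
The same crash course can be written out as a toy Python sketch. The
population and the yes/no questions are invented for illustration; the
only point is that each question at most doubles the number of groups,
so the description gets less abstract by roughly a factor of two per
question asked.

    import random

    # Ten thousand made-up people with a couple of made-up attributes.
    population = [{"eye_color": random.choice(["blue", "brown"]),
                   "height_cm": random.randint(150, 200)}
                  for _ in range(10000)]

    def split(groups, predicate):
        """Split every group by one yes/no question, dropping empty halves."""
        result = []
        for group in groups:
            yes = [person for person in group if predicate(person)]
            no = [person for person in group if not predicate(person)]
            result.extend(half for half in (yes, no) if half)
        return result

    groups = [population]                                       # one group
    groups = split(groups, lambda p: p["eye_color"] == "blue")  # two groups
    groups = split(groups, lambda p: p["height_cm"] > 175)      # about four
    print(len(groups))

    # Roughly log2(10000), about fourteen, well-chosen questions would put
    # every individual into a group of one -- and the description would
    # still be abstract, since people are not fundamental particles.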

And here, in a universe equipped with the Heisenberg uncertainty
principle, we find that you can't. The description is still abstract.
It can be made less so, as we expend greater and greater sums of
energy to pin down ever more precise configurations of human beings,
but to eliminate abstraction entirely would require infinite energy.
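
For a back-of-the-envelope sense of why: to resolve a particle's
position to within a distance d you need a probe with a wavelength no
longer than d, and a photon of wavelength d carries energy hc/d. Let d
go to zero and the energy of the measurement grows without bound. That
is only my rough gloss on the standard argument, not a careful
accounting of what a complete description would cost.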

In this thread Gordon explicitly rejects the notion that a mind can be
copied, in whole, to another substrate without a catastrophic
interruption in subjective experience. I agree with this, but I think
it's for a completely different reason. I can't say for sure because
his clarification made things less clear.

Proposition A: a machine operating by formal programs cannot replicate
the human mind.
Proposition B: a neural network could conceivably replicate the human mind.
Logical Conclusion: an individual human mind cannot be extracted from
its neurological material.

This does not appear to follow, unless you were counting artificial
neural networks as "neurological material". I understood you to mean
the specific neurons originally responsible for instantiating the mind
in question. By my understanding, that one experiment in which
you replace each individual neuron with an artificial duplicate, one
by one, would preserve the same conscious mind you started with.

Actually I am kind of counting on this last point being true, so I
have a vested interest in finding out whether or not it is. If you can
convince me of my error before I actually act on it, Gordon, I would
appreciate it.

For the record, I am a dualist in the sense that I believe minds are
distinct entities from brains, as well as that programs are distinct
entities from computers. However, I do not believe that minds or
programs are composed of a "substance" in any sense. Both are
insubstantial. Software (which I say includes minds) is one layer of
abstraction higher than its supporting hardware (which I say includes
brains), and therefore one order of magnitude less "real".

I'm not sure what the radix is for that order of magnitude, but I am
absolutely confident that it is exactly one order!


