[extropy-chat] Fragmentation of computations

Lee Corbin lcorbin at rawbw.com
Mon Mar 26 17:04:15 UTC 2007

Stathis writes

> > Hmm, I suppose that what you and Chalmers
> > are really asserting is that the subject has fewer
> > "qualia" over time, but only insofar---to rephrase it
> > ---as he is becoming less an actual subject moment
> > by moment.

> No, the argument asserts that he *can't* have fewer experiences over time

Oh, okay.

>  (that's a standard English word with the same meaning as "qualia") as his
> neurons or squares are being replaced, so that gradual zombification is
> impossible. This is because you can't have a half-zombie state where, for
> example, half your neurons are replaced and although the whole person
> says "yes, I can see the light" for the benefit of external observers, internally
> you are thinking "I can't see the light"....

I agree that you cannot be having different internal thoughts. Yes indeed, this
would mean that a bizarre state had been reached by the subject's brain.
Since we are just rehearsing previous deterministic runs, that would, yes,
be quite impossible.

As you write

> your consciousness would have to magically drift off in a different direction,
> decoupled from the physical activity presumed to be underpinning it. And if
> gradual zombification by gradual replacement cannot happen, then sudden
> total zombification when the last neuron or last square is replaced also cannot
> happen, for it is absurd to think that your entire consciousness could be
> sustained by one neuron or one square. 

Well, the zombification that I am talking about works quite differently.
Suppose for a moment that I am right about states having to be
causally connected in order for there to be information flow, and
in order for there to be an internal experiencer.  Then it would follow
that a sequence of looked-up states could not be conscious, and one
would have a classic zombie.

That is, if we are presented with a robot that appears to be conscious,
and behaves absolutely no differently from anything we could expect,
then I might under extremely fantastical conditions be persuaded that
it is a zombie.  These conditions would be that (i) we were ourselves
merely being recomputed from sessions that our original selves had
participated in long ago, and (ii) the robot's states are not being
computed, but merely copied from that long-ago run.  In this case, a
lookup table is very simple:  the next state of the robot is just
fetched from an ordered list.  It's as though we were watching a movie
of the robot.
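To make the distinction concrete, here is a minimal sketch (with a hypothetical `step` function standing in for the robot's real dynamics) contrasting a run whose states are genuinely computed, each caused by its predecessor, with a replay that merely fetches the same states from an ordered list:

```python
def step(state):
    """Causally compute the next state from the current one."""
    return (state * 31 + 7) % 100  # stand-in for the robot's real dynamics

# First run: each state is computed from, and caused by, the last.
state = 42
recording = [state]
for _ in range(5):
    state = step(state)
    recording.append(state)

# Replay: the "robot" just fetches each state from the ordered list.
# No state is derived from its predecessor; it is a movie of the run.
replayed = [recording[i] for i in range(len(recording))]

print(recording == replayed)  # identical behavior, different causation
```

The two sequences are indistinguishable to an outside observer; the only difference is whether each state is produced by the one before it or simply looked up.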

So I'll say that theoretically a zombie is possible.  It's just as Eugen
pointed out: the resources required are ridiculous.  Certainly any
device before me that passed the Turing Test would be, in my opinion,
quite conscious.  But I would be assuming that its successive states
were being computed.


P.S.  I am hoping that this also answers some of Russell's objections,
or at least makes clearer what I was trying to say to him.
