[extropy-chat] Fragmentation of computations
lcorbin at rawbw.com
Fri Mar 30 14:46:02 UTC 2007
> > When you write "I suppose that it isn't impossible that the subject's
> > consciousness is rapidly flickering during this interval", you are
> > perhaps referring to the subjective quality of the experience. To the
> > degree that you are so referring, I don't look at it quite in the same
> > way. There would, to me, be absolutely no perception of any
> > flickering, or of anything unusual at all. It's just that the objective
> > *amount* of consciousness going on there inside the system must
> > (if all my hypotheses are right) be diminished by some fraction.
> Yes, it would be rather as if I were dead every other second: I wouldn't
> notice anything, but my consciousness per unit time would be halved.
> Actually, I don't know if this would be such a bad thing, if zombie me
> didn't do anything I wouldn't do during the dead periods. Suppose I were
> offered a 10% salary increase if I volunteered to be dead half the time.
> If I took it up, I would only be able to really enjoy 55% of my current
> salary; on the other hand, I would *think* I was enjoying 110% of my
> current salary, at the cost of only a twinge of discomfort from knowing
> what was really going on. Should I take up the offer?
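The arithmetic in the quoted scenario can be checked with a short sketch (the figures are the hypothetical ones from the example above, with current salary normalized to 1):

```python
# Hypothetical salary arithmetic from the quoted scenario.
current_salary = 1.0          # normalize current salary to 1
raise_factor = 1.10           # the offered 10% salary increase
conscious_fraction = 0.5      # "dead" (unconscious) half the time

# Salary you *think* you enjoy: the full raised amount.
perceived = current_salary * raise_factor

# Salary you *really* enjoy: only the fraction experienced consciously.
really_enjoyed = perceived * conscious_fraction

print(perceived, really_enjoyed)
```

This reproduces the quoted figures: 110% perceived, but only 55% really enjoyed.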
This is an example, quite interesting itself, where the language of "benefit"
clarifies the situation. One very likely shouldn't live at all if one does not
have a life worth living. Likewise, the 10% increase in salary will deliver
some amount of benefit. So this reduces to the question of whether, on
the whole, given, say, that we exist only finitely long in one particular
universe, it would be better to live twice as long (but with reduced
benefits). Of course, in your example here, a ten percent increase in
salary doesn't sound like much.
By the way, thanks very much for the careful understanding you extended
to a position that you don't really agree with.
> You can't harm the consciousness in a rock by crushing the rock,
> because the whole idea is that it does not depend on any particular
> configuration of matter. In the final analysis, the consciousness is
> entirely contained in the not-actually-realized table mapping
> arbitrary physical states to computational states...
> Thus, this rock-is-conscious idea is another way of saying that all
> conscious computations are realized merely by virtue of their
> status as platonic objects. Strange though it may seem, it is
> consistent with the basic idea of functionalism, which is that
> consciousness resides in the form, not the substance.
Yes, I guess that's right. Perhaps Putnam was right: functionalism,
if taken to extremes, does lead to the entirely counter-intuitive idea
that rocks and dust---and even collections of photons anywhere---
undertake all the same computations and have all the same
experiences as we do. (I consider that to be surely quite wrong.)
So then let me amend the question that I was asking:
It seems quite inescapable that conscious
robots could, and shortly will, exist, and that it will be possible to
take such a program and single-step through its deterministic
execution. And that such a program---either perhaps suffering
horribly or gaining a great deal of satisfaction---compels us to make
a moral choice. But if rocks continue to be conscious whether
pulverized or not, as does any system that can take on many states
(given a fantastically loose definition of "system"), then of
what special status or value are humans and animals? Is caring for
another human being completely inconsequential because either
saving them from grief or inflicting grief upon them doesn't change
the platonic realities at all?
(It's funny that in all the discussions of abstract computing, I don't recall
much discussion of the repercussions in our daily lives of accepting
the implications of these weird doctrines.)
By clinging to the "ad hoc" causal requirement that there has to be
local information flow in a local system for any of the things we value
to obtain, I escape the dilemma. What do you do?