[ExI] Coherent vs. Incoherent Fears of Being Uploaded
stathisp at gmail.com
Thu Jan 21 11:33:44 UTC 2010
2010/1/21 Emlyn <emlynoregan at gmail.com>:
>> Do you really think it would be easier to make a device with
>> subjectivity than without? I think it is more likely that the
>> subjectivity is a necessary side-effect of the information processing
>> underpinning it.
> I don't know how subjective experience works at all. But notice that
> we don't use feelings about things for any higher order cognitive
> tasks; it's always used in "gut" (ie: instinctive)
> reactions/decisions, broad sweeping heuristic changes to other parts
> of the brain (eg: Angry! Give more weight to aggressive/punitive
> measures!), simple fast decision making (hurts! pull away!).
You may have a narrower understanding of "feelings" or "qualia" than I
do. Every cognitive task involves a feeling, insofar as I am aware
that I am doing it. If I am counting, I am aware of the feeling of
counting. Computers can count, but I really doubt that they have
anything like my feeling. When I count, I sort of have a grand view of
all the numbers yet to come, an awareness that I am counting and of the
reason I am counting, a vision of each number as it would appear
written down, an association with significant numbers from other
aspects of my life, and so on. A computer does not do these things,
because they are for the most part not necessary for the task.
However, a conscious computer would reflect on its actions in this
way, except perhaps when it devoted a subroutine to a mundane task,
equivalent to a human subconsciously digesting his food.
> When I think closely about what my subjective experience is, sans all
> the information processing, I find there's just not much left for it
> to do, except that it feels things (including "qualia"), and
> communicates the results of that to the rest of the brain. How does
> that happen? Buggered if I know. Why not just "simulate" feeling, in
> an information processing kind of way? You've got me there, that's
> exactly what I would do if I had to replicate it.
How do you know that simulating feelings won't actually produce
feelings? It could be that the feeling is *nothing more* than the
system observing itself observe, something along the lines of what Jef
has been saying. If this is so, then it would be impossible to make a
zombie. People are still tempted to say that it is *conceivable* to
make a zombie, but maybe conceivable is too loose a word. It is
"conceivable" that 13 is not prime, but that doesn't mean much, since
13 is definitely prime in all possible worlds. Perhaps if we
understood cognition well enough its necessary association with
consciousness would be as clear as the necessary association between
13 and primeness.