[ExI] Coherent vs. Incoherent Fears of Being Uploaded

Emlyn emlynoregan at gmail.com
Thu Jan 21 12:03:20 UTC 2010


2010/1/21 Stathis Papaioannou <stathisp at gmail.com>:
> 2010/1/21 Emlyn <emlynoregan at gmail.com>:
>
>>> Do you really think it would be easier to make a device with
>>> subjectivity than without? I think it is more likely that the
>>> subjectivity is a necessary side-effect of the information processing
>>> underpinning it.
>>
>> I don't know how subjective experience works at all. But notice that
>> we don't use feelings about things for any higher order cognitive
>> tasks; they're always used in "gut" (ie: instinctive)
>> reactions/decisions, broad sweeping heuristic changes to other parts
>> of the brain (eg: Angry! Give more weight to aggressive/punitive
>> measures!), simple fast decision making (hurts! pull away!).
>
> You may have a narrower understanding of "feelings" or "qualia" than I
> do. Every cognitive task involves a feeling, insofar as I am aware
> that I am doing it.

"Insofar as I am aware that I am doing it". Exactly! What is it, to be
aware of what you are doing? It is to feel.

Yet, if you examine how you think, even through an abstract,
multi-step problem, you'll find that it's nothing like how a computer
would do it. You have access to states in the thought process, but
not to the program that is creating that process. If you think about
it, most of your thoughts are only loosely connected; to get a strict
derivation from the start to the end of a train of thought, we really
have to carefully write everything down, interpolate missing pieces
after the fact, tease out assumptions, and so on.

What that tells me is that, while the aware/experiencing "you" is a
definite part of the process (intricately involved in kicking off
unrelated directions of inquiry, as you say below), it's by no means
most of it, or even the most important part. I think that, rather
than "you" originating your thoughts, you merely receive them from
elsewhere (newer, cleverer parts of the cortex, I guess), as if they
were written on a slate and you read them off, mistaking them for
your own creations.

> If I am counting, I am aware of the feeling of
> counting. Computers can count, but I really doubt that they have
> anything like my feeling.

Also, *how do you count*? I can tell you how a computer counts, down
to the electrical signals in the hardware, but we cannot by inspection
know very much about how we think when we do really abstract stuff.
Well, in fact, I think counting is probably largely sequential recall
for small numbers, and clunky execution of an algorithm for larger
numbers, but that indeed is where the mechanism gets hazy, don't you
think?
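For contrast, here's a toy Python sketch (mine, not part of the original exchange) of how a computer counts at roughly the hardware level: repeated increment of a binary register, with the carry rippling through bit by bit, the way a ripple-carry adder propagates it.

```python
def increment(bits):
    """Increment a little-endian list of bits by one, ripple-carry style."""
    carry = 1
    out = []
    for b in bits:
        total = b + carry
        out.append(total % 2)   # this bit of the result
        carry = total // 2      # carry propagates to the next bit
    if carry:
        out.append(1)           # register grows by one bit on overflow
    return out

def count_to(n):
    """Count from 0 to n by repeated increment, returning each value."""
    bits = [0]
    values = []
    for _ in range(n):
        bits = increment(bits)
        values.append(sum(b << i for i, b in enumerate(bits)))
    return values
```

Every step here is inspectable, which is exactly the contrast with introspection: we have nothing like this trace for our own counting.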

> When I count, I sort of have a grand view of
> all the numbers yet to come, an awareness that I am counting and the
> reason I am counting, a vision of each number as it would appear
> written down, an association with significant numbers from other
> aspects of my life, and so on. A computer does not do these things,
> because they are for the most part not necessary for the task.
> However, a conscious computer would reflect on its actions in this
> way, except perhaps when it devoted a subroutine to a mundane task,
> equivalent to a human subconsciously digesting his food.

Don't you find it suspicious that the pieces of your cognition that
are least relevant to an abstract task (like arithmetic) are the ones
you most readily feel and experience? I think we have so many of
these "unnecessary" experiences during abstract thought because the
part of our brain which is most subjectively "us" is very poor at the
kind of linear, algorithmic work involved, and in fact is probably
not really doing that work; it's doing something else - free
associating?

>
>> When I think closely about what my subjective experience is, sans all
>> the information processing, I find there's just not much left for it
>> to do, except that it feels things (including "qualia"), and
>> communicates the results of that to the rest of the brain. How does
>> that happen? Buggered if I know. Why not just "simulate" feeling, in
>> an information processing kind of way? You've got me there, that's
>> exactly what I would do if I had to replicate it.
>
> How do you know that simulating feelings won't actually produce
> feelings?

I don't know that; I won't claim to.

> It could be that the feeling is *nothing more* than the
> system observing itself observe, something along the lines of what Jef
> has been saying.

People have been waving their hands in this direction for years; that
our subjective awareness is the result of us being conscious of being
conscious in a tightening loop that magically produces us. Maybe. I
don't think that's right though. I say this because I can imagine
subjective awareness without the machinery needed to be "self aware"
in the sense that humans are; or at least without being aware of being
aware. That's why, for instance, I think many animals (not just
dolphins/monkeys/octopuses/et al) are conscious.

> If this is so, then it would be impossible to make a
> zombie. People are still tempted to say that it is *conceivable* to
> make a zombie, but maybe conceivable is too loose a word. It is
> "conceivable" that 13 is not prime, but that doesn't mean much, since
> 13 is definitely prime in all possible worlds. Perhaps if we
> understood cognition well enough its necessary association with
> consciousness would be as clear as the necessary association between
> 13 and primeness.
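As an aside, the primeness of 13 that the analogy leans on is itself a purely mechanical fact; a minimal trial-division check (my illustration, not from the thread) makes it transparent:

```python
def is_prime(n):
    """Trial-division primality check: try every divisor up to sqrt(n)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False  # found a proper divisor, so n is composite
        d += 1
    return True
```

That 13 passes this check in every run, on every machine, is the sense in which it is "prime in all possible worlds".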

Well, I can't see why we couldn't make an AI which was a "zombie".
And, that AI could easily be "self aware" in a sense (ie: have a model
of itself in the world, have a model of itself as being an entity with
a model of itself in the world, etc), without having a sense of
subjective experience. It'd be easy to determine too; it just wouldn't
understand what you meant by subjective experience.
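A hypothetical sketch (the names and structure are mine, purely illustrative) of that kind of nested self-model: an agent whose model of itself can in turn model itself, to any depth, with no claim of subjective experience anywhere in it.

```python
class Agent:
    """An agent that is "self aware" only in the modelling sense."""

    def __init__(self, name, depth=0):
        self.name = name
        self.depth = depth          # how many levels of modelling deep we are
        self._self_model = None     # built lazily, on demand

    def self_model(self):
        """Return this agent's model of itself, creating it if needed."""
        if self._self_model is None:
            self._self_model = Agent("model-of-" + self.name, self.depth + 1)
        return self._self_model
```

Usage: `Agent("robot").self_model().self_model()` yields a model of a model of the robot, and so on indefinitely; nothing in the recursion requires, or produces, anything you'd call experience.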

-- 
Emlyn

http://www.songsofmiseryanddespair.com - My show, Fringe 2010
http://point7.wordpress.com - My blog


