[ExI] Coherent vs. Incoherent Fears of Being Uploaded

Stathis Papaioannou stathisp at gmail.com
Thu Jan 21 12:53:00 UTC 2010


2010/1/21 Emlyn <emlynoregan at gmail.com>:

> Yet, if you compare how you think through even an abstract,
> multi-step problem with how a computer might do it, you'll find the
> two are nothing alike. You have access to states in the thought
> process, but not to the program that is creating that process. If
> you think about it, most of your thoughts are only loosely
> connected; to get a strict derivation from the start to the end of
> a train of thought we really have to carefully write everything
> down, interpolate missing pieces after the fact, tease out
> assumptions, and so on.

Presumably this is just because of the haphazard way evolution put the
brain together.

> What that tells me is that, while the aware/experiencing "you" is a
> definite part of the process (intricately involved in kicking off
> unrelated directions of inquiry, as you say below), it's by no means
> most of it, or even the most important part. Rather than saying
> that "you" originate your thoughts, I think you merely receive them
> from elsewhere (the newer, cleverer parts of the cortex, I guess),
> as if they were written on a slate and you read them off, mistaking
> them for your own creations.

It seems that the newer, cleverer parts of the brain are the ones
whose work we are most aware of. Phylogenetically, the oldest parts
of the nervous system are probably things like the ganglia
regulating gut motility, and we are not aware of the thinking, such
as it is, that they do. The parts of the cortex perhaps unique to
humans are involved in language, abstract thought and modelling
ourselves as entities in the world, and these activities require a
lot of self-awareness.

> Don't you find it suspicious that the pieces of your cognition
> that are least relevant to an abstract task (like arithmetic) are
> the ones you most readily feel and experience? I think the reason
> we have so many of these "unnecessary" experiences with abstract
> thought is that the part of our brains which is most subjectively
> "us" is very poor at the kind of linear, algorithmic work involved,
> and in fact is probably not really doing that work; it's doing
> something else - free associating?

The brain does a lot of very complex calculations in, for example,
visual processing, which we are completely unaware of. We are only
aware of the final result. But perhaps it is correct to say that the
final result *is* the awareness of the processing in aggregate, seeing
the whole picture and recognising it rather than looking at each
individual pixel.
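
As a loose illustration (purely hypothetical code, not a claim about
how the brain actually works), it is a bit like a pipeline that
exposes only its final output while the intermediate stages stay
hidden:

    # Hypothetical sketch: a multi-stage process whose intermediate
    # results are inaccessible from outside; only the aggregate
    # "whole picture" is reported.
    class VisualPipeline:
        def _edges(self, pixels):
            # Stage 1: hidden intermediate computation.
            return [p > 128 for p in pixels]

        def _aggregate(self, edges):
            # Stage 2: another hidden intermediate computation.
            return sum(edges)

        def recognise(self, pixels):
            # Only this final, aggregate result reaches the caller.
            bright = self._aggregate(self._edges(pixels))
            return "bright scene" if bright > len(pixels) / 2 else "dark scene"

    print(VisualPipeline().recognise([200, 40, 220, 190]))  # bright scene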

> Well, I can't see why we couldn't make an AI which was a "zombie".
> And that AI could easily be "self-aware" in a sense (i.e. have a
> model of itself in the world, have a model of itself as an entity
> with a model of itself in the world, and so on) without having any
> subjective experience. It'd be easy to tell, too; it just wouldn't
> understand what you meant by subjective experience.

What we call AIs today may be zombies, but I am not sure that an AI
that could have prolonged contact with humans and fool them into
thinking it was one of them could be a zombie. Daniel Dennett has
argued that such an AI would have to have zombie beliefs and the
like, which would be indistinguishable from real beliefs. And David
Chalmers's Fading Qualia argument, which I have described at length,
implies that if zombies are possible then we might all be zombies
without realising it, suggesting that there is no logical
distinction between a conscious being and its zombie equivalent.
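
As a minimal sketch (hypothetical code, not anyone's proposed
architecture), the purely functional self-awareness Emlyn describes
- a model of itself containing a model of itself - is cheap to
build, and by itself says nothing either way about subjective
experience:

    # Hypothetical illustration: an agent whose world model contains
    # a model of the agent, which in turn contains a model of itself,
    # down to a chosen depth.
    def make_agent(depth):
        agent = {"kind": "agent", "world": {"objects": ["tree", "rock"]}}
        if depth > 0:
            # The agent's world includes a model of the agent itself.
            agent["world"]["self_model"] = make_agent(depth - 1)
        return agent

    # An entity with a model of itself as an entity with a model of itself:
    reflective = make_agent(2)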


-- 
Stathis Papaioannou


