[ExI] Is the brain a digital computer?

Stathis Papaioannou stathisp at gmail.com
Wed Feb 24 06:22:27 UTC 2010


On 24 February 2010 02:31, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> --- On Tue, 2/23/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>
>> I entertain the possibility that perhaps consciousness is different to
>> other phenomena in the universe
>
> I posit nothing special about consciousness. It only *seems* unusual because we can know about it only from the first-person perspective. Only you can feel your toothache, for example.
>
>> and might be separable from the
>> observable behaviour it seems to underpin.
>
> I can write a program today that will make a computer act conscious to a limited degree. I have many times. With enough knowledge and resources I could write one that fooled you into thinking it had consciousness -- that caused a computer or robot to behave in such a way that it passed the Turing test. So I don't understand why you should even question the separability of behavior and consciousness.

You are only able to program a computer so that it has very limited
and specialised intelligence, like an amoeba or a flatworm, and
therefore proportionately limited consciousness. I understand that you
say the difference is only a matter of degree, but then the difference
between a flatworm and a mouse or a human is also just a matter of
degree.

>> That would mean I could replace part of my brain with a functionally
>> identical but unconscious analogue selectively removing any aspect of my
>> consciousness without noticing that anything had changed and without
>> displaying any outward change in behaviour. I believe that is absurd,
>> and this leads me to conclude that those who immediately saw that B'
>> must be conscious were right.
>
> We've been through this so many times. :)
>
> One cannot on my view make one of your "functionally identical but unconscious analogues" in the first place if the component normally affects experience, at least not without changing other parts of the brain along with it. One might just as well try to draw a square triangle.

So you claim both that weak AI is possible and that weak AI is
impossible? If weak AI is possible then by definition it is possible
to make an artificial neuron, collection of neurons or person that
behaves just like a biological neuron, collection of neurons or person
but lacks consciousness. If it does not behave identically, then you
have failed in your effort to create weak AI. I pointed this out in
the very first posts on these threads as the position of Roger
Penrose, who thinks that the NCC (the neural correlates of
consciousness) is something fundamentally non-algorithmic,
incorporating exotic physics that no Turing machine could compute. The
consequence would be that it is impossible in general to simulate, on
a digital computer, the behaviour of any organism or part of an
organism that contains an NCC. A computer might be able to do some
tasks that are considered intelligent, but it would never be able to
reproduce the full behavioural gamut of a real human, perhaps failing
at tasks involving creativity or natural language, for example. This
position, unlike yours and Searle's, is internally consistent, but
there is no scientific evidence for it.

> The undertaking becomes problematic because replacing the neural correlates of consciousness or any part of them with a *supposed* functional but unconscious analogue will eliminate or compromise subjective experience. Experience affects behavior in normal people, and because the subject will have abnormal experience, the doctor will then need to do more work to make him behave and report normally.

The NCC will, of course, affect the behaviour of the whole brain and
hence person. My claim is that this effect on behaviour *is* the NCC,
so that if it is reproduced, whether by a digital computer, beer cans
and toilet paper or a little man pulling levers, then the
consciousness will also be reproduced. I still don't understand your
position on simulating the NCC, because you keep alternating between:

(a) it is impossible to reproduce the behaviour of the NCC using a computer; and
(b) it is possible to reproduce the behaviour of the NCC using a
computer but this still won't reproduce the behaviour of the rest of
the brain.

As I have said, (a) is philosophically sound but there is no
scientific evidence in its support, while (b) is worse than wrong: it
is contradictory.


-- 
Stathis Papaioannou
