[ExI] Zombie glutamate
stathisp at gmail.com
Tue Feb 17 00:07:09 UTC 2015
On 17 February 2015 at 10:07, John Clark <johnkclark at gmail.com> wrote:
>> > Essentially as you said above - if it were possible to separate
>> > consciousness from behaviour
> That's a big "if" but OK.
I believe that this is false, but that is the point of the argument -
assume that zombie components are possible and see where it leads.
>> > it should be possible to make a visual cortex which functions normally
>> > and put it in your brain.
>> > You would then lack visual perception - which is the definition of
>> > blindness
>> > but you would behave normally
> No, you wouldn't behave normally. If I threw a ball at you you'd be unable
> to catch it, the artificial visual cortex might be able to track the ball
> but that's only a small part of the brain, other parts, the conscious parts,
> decide that it would be fun to catch the ball and then send nerve
> impulses to the muscles in your arm to actually do so. But you wouldn't
> decide to catch the ball because you are blind and didn't even know that a
> ball had been thrown.
But you would HAVE to behave normally, by definition. The artificial
visual cortex receives input from the optic tracts, processes it, and
sends output to association cortex and motor cortex. That is its
design specification. That is its ONLY design specification: it is
made by engineers who think consciousness is bullshit. My point is
that such a device would, as an unintended side-effect, necessarily
preserve consciousness. If it were possible to make a brain implant
that did all the mechanistic stuff perfectly but lacked consciousness,
then you would end up with a being that was blind but behaved normally
and thought it could see normally. But that is absurd - so it isn't
possible to make such an implant. (It might be possible if
consciousness is due to an immaterial soul - but as I said at the
start, the assumption is that it is due to processes in the brain.)