[ExI] Zombie glutamate
brent.allsop at canonizer.com
Tue Feb 17 15:36:10 UTC 2015
You keep saying: "You can't prove if something else is conscious." But
does your left brain hemisphere not know, more surely than we know
anything, not only that your right hemisphere is conscious, but what it is
qualitatively like? And if that is possible, why are you assuming we can't
do the same thing the corpus callosum is doing, between brains, not just
between brain hemispheres?
You and Stathis keep talking about separating consciousness from behavior.
If we are talking about real glutamate vs. zombie glutamate, you must agree
that real glutamate can behave the way it does because of its intrinsic
physical glutamate properties. Whereas, even though zombie glutamate can
behave the same way, it can only do so if it has interpretation hardware
that interprets something which, by definition, does not have glutamate
properties as if it did. So this proves it is possible to reproduce
zombie glutamate (or zombie functional isomorph, if you must) behavior
without consciousness, um, I mean without real glutamate intrinsic
properties (hint: these are the same thing). So I don't understand why
both of you seem to be so completely missing the obvious. It seems to me
that both of you continue to completely ignore these simple, obvious facts.
On Mon, Feb 16, 2015 at 10:10 PM, Stathis Papaioannou <stathisp at gmail.com> wrote:
> On 17 February 2015 at 14:16, John Clark <johnkclark at gmail.com> wrote:
> > On Mon, Feb 16, 2015 Stathis Papaioannou <stathisp at gmail.com> wrote:
> >> > you would HAVE to behave normally, by definition. The artificial
> >> visual cortex receives input from the optic tracts, processes it, and
> >> sends output to association cortex and motor cortex. That is its
> >> design specification.
> > Then behavior would be the same. And I assume that, although functionally
> > identical with the same logical schematic, this artificial visual cortex
> > uses a different substrate such as electronics; otherwise the thought
> > experiment wouldn't be worth much.
> >> > That is its ONLY design specification: it is made by engineers who
> >> > think consciousness is bullshit. My point is that such a device would,
> >> > as an unintended side-effect, necessarily preserve consciousness.
> > I think so too; I would bet my life on it, but I can't prove it. I can't
> > prove or disprove that blind people aren't conscious because it's the
> > biological visual cortex itself that produces consciousness. And I can't
> > prove or disprove that people lacking a left big toe aren't conscious
> > because it is that toe that generates consciousness. I think both logical
> > possibilities are equally likely.
> >> > If it were possible to make a brain implant that did all the
> >> > stuff perfectly but lacked consciousness, then you would end up with a
> >> > being that was blind
> > The being had a working visual cortex, so how could it be blind?
> Because the visual cortex is perfectly functional according to any
> test you do on it but lacks consciousness. It is made by engineers who
> think consciousness is bullshit.
> >> > but behaved normally and thought it could see normally.
> > And the being was correct, it could see; it was probably conscious too,
> > and it could certainly see.
> >> > But that is absurd
> > I'm still not seeing what's absurd.
> If it were possible to separate consciousness from function, then it
> would be possible to make a visual cortex that has normal function but
> lacks consciousness, so if you put it into your brain you would lack all
> visual perception but function normally and believe you could see
> normally. That would be absurd - I think you have agreed. Therefore,
> it is not possible to make a functional analogue of your visual cortex
> that lacks consciousness. The consciousness comes as a necessary
> side-effect, whether you want it there or not.
> Stathis Papaioannou