[ExI] Digital Consciousness

Brent Allsop brent.allsop at canonizer.com
Thu Apr 25 20:38:14 UTC 2013


Yes Kelly,

You're definitely making progress.  There are only a few remaining issues
whose significance you don't yet seem to have fully grasped.

First off, if you are a functional property dualist, that is certainly yet
another easily falsifiable theory that has yet to be experimentally
falsified.  In other words, if it is only glutamate that has redness,
experiments will bear this out; the same holds if any functional system can
reproduce a redness experience.  It's only a matter of time until someone
experimentally demonstrates which of these is the case, forcing all
experts into the same camp.

Next, you seem to be missing the significance of what happens when you
substitute the glutamate with virtual glutamate, when you say "if it walks
like a duck…".  When you substitute the real glutamate, the prediction is
that you will experience some kind of fading quale, which Chalmers
acknowledges as a possibility and which is exactly what these theories
predict.  In other words, if it is your brain where we are doing the neural
substitution, then as long as you are using the real merging system,
including the corpus callosum, which binds all these elemental qualities
into your combined painted conscious experience, emotions and all, every
elemental quality you replace with an abstracted representation will 'fade'
from your painted consciousness as it is removed from the real system that
knows when something is or isn't real glutamate.  The prediction is that
nothing that is not real glutamate will ever produce real redness for you.

Of course, you will eventually be able to replace the entire binding system
with abstracted stuff that is only being interpreted as the same thing: it
relies on interpretations of abstracted representations that are not
fundamentally anything like glutamate, and not the real thing.  Only once
you have this interpretation layer fully in place will the abstracted
system start to claim that the virtual glutamate is the real thing.  And,
due to Occam's razor, we must assume that it is a zombie: though it claims
to be experiencing real redness, we will know that it is really something
fundamentally very different, only being interpreted as redness, and that
the real fundamental stuff, both causally and qualitatively, does not exist
in this very different stuff that is merely interpreted as the same.  In
other words, when it is your mind being substituted, it will stop acting
like a duck: the red qualities will 'fade' in some way whenever you take
away real glutamate.  There will be no way to bridge this gap, no way to
validate that the stuff being interpreted as the real thing actually
produces the real thing, so just as we shouldn't believe in the existence
of purple unicorns, because there is no evidence for them, the same applies
here.  But real glutamate you will be able to validate with your brain, via
exactly this substitution process.  Nothing but real glutamate will have
your redness quality.
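To make the structure of that argument easier to see, here is a minimal,
purely illustrative sketch in Python.  The names (RealGlutamate,
VirtualGlutamate, InterpretationLayer, redness_detector) are my own
hypothetical labels, not anything from Chalmers or the canonizer camps; the
only point is that the detector says "yes" to the real thing, says "no" to
the abstracted stand-in (the fading quale prediction), and says "yes" to
the stand-in only once a separate interpretation layer is in place.

class RealGlutamate:
    """Stands in for the real molecule, with its real causal properties."""
    def causal_signature(self):
        return "real_glutamate"

class VirtualGlutamate:
    """An abstracted representation: fundamentally nothing like the molecule."""
    def causal_signature(self):
        return "abstract_representation"

class InterpretationLayer:
    """Wraps an abstracted part so it is treated *as if* it had the real signature."""
    def __init__(self, wrapped):
        self.wrapped = wrapped
    def causal_signature(self):
        # The translation happens here, not in the wrapped thing itself.
        return "real_glutamate"

def redness_detector(thing):
    """The brain as a high fidelity detector: answers 'yes, that is my redness'
    only when the real causal signature is present."""
    return thing.causal_signature() == "real_glutamate"

print(redness_detector(RealGlutamate()))                          # True
print(redness_detector(VirtualGlutamate()))                       # False: the 'fading quale' prediction
print(redness_detector(InterpretationLayer(VirtualGlutamate())))  # True, but only by interpretation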

As Stathis pointed out, Chalmers' neural substitution argument is a general
idea, which works for almost all theories.  And this general idea can be
shown to be invalid in the same way for functional property dualism
theories.  All you do is replace the glutamate in this theoretical
idealized effing world with whatever it is that has, is reliably
responsible for, or is the neural correlate of a redness experience.  James
Carroll likes to call this functional stuff that has a redness quality a
Functionally Active Pattern, or "FAP", because he admits that a static set
of ones and zeros does not have a redness quale until it becomes
functional in some way.  So all you do is replace the glutamate with
whatever this Functionally Active Pattern is, or whatever your theory
predicts has the redness quality.
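As a toy illustration of that static-versus-functional distinction (my own
hedged sketch, not anything James Carroll has written): a stored string of
ones and zeros is just inert data, and whatever the FAP view cares about
only shows up when some process is actually running over that data.

# A static set of ones and zeros: on the FAP view, inert data, nothing active about it.
static_bits = [1, 0, 1, 1, 0, 0, 1]

def run_pattern(bits):
    """Make the pattern 'functionally active' by actually executing a process over it.
    (Which function is the relevant one is exactly what a theory has to specify.)"""
    state = 0
    for bit in bits:
        state = (state << 1) | bit  # the bits now participate in an ongoing causal process
    return state

print(static_bits)               # just data sitting there
print(run_pattern(static_bits))  # 89: the same bits doing something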

Anders is talking about "level independent consciousness".  If science
proves your theory, that a redness quality can 'arise' from anything, as
long as it is functioning correctly, whether it is greenness being
interpreted as redness or whatever, then "level independent consciousness"
will be proven possible, and we may be in an abstracted simulation.
Likewise, if some material substance, say glutamate, is the only thing
science can demonstrate to your brain as having a redness quality we can
experience, it will demonstrably prove that "level independent
consciousness" is not possible, and that we are not in an abstracted
simulation.  Though we could still be in a phenomenal simulation, where
stuff with real redness, in the basement world, is being used in the
simulation.

So, in this idealized functional effing theory world, which is predicted by
your camp's working hypothesis, it is still possible to show that Chalmers'
idea isn't a proof, through this same thought process.  You still suffer
from the quale interpretation problem, in that it is possible for anything,
including some string of binary numbers, say a "1", which does not have the
redness quality, to be interpreted as if it were the redness quality.  But
of course, by definition, it will not have the functionally active pattern
from which the redness quality arises.  It will only be like it to the
degree that you have overcome the quale interpretation problem: you are
able to look this "1" up in a dictionary, find the real glutamate, um, I
mean, the real functionally active pattern, and only think of this very
different pattern as representing that real redness quality.
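That dictionary step can be spelled out in a couple of lines.  This is only
a hedged illustration of the lookup, with a made-up mapping: the token "1"
is nothing like a functionally active pattern, and it only gets treated as
standing in for redness because someone supplied the dictionary.

# The abstract token "1" has no redness quality of its own.
token = "1"

# A made-up dictionary supplied from outside: this is the interpretation step.
quale_dictionary = {"1": "functionally_active_pattern_for_redness"}

print(token == "functionally_active_pattern_for_redness")  # False: the token is not the FAP
print(quale_dictionary.get(token))  # only via the lookup is "1" treated as representing redness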


Brent Allsop

On Thu, Apr 25, 2013 at 10:54 AM, Kelly Anderson <kellycoinguy at gmail.com> wrote:

> On Thu, Apr 25, 2013 at 9:15 AM, Brent Allsop <brent.allsop at canonizer.com> wrote:
>
>> Hi Stathis,
>>
>> (And Kelly Anderson, tell me if given what we've covered, does the below
>> make sense to you?)
>>
>> It is not a 'proof' that abstracted computers can be conscious.  It
>> completely ignores many theoretically possible realities.  For example,
>> Material Property Dualism is one of many possible theories that proves this
>> is not a 'proof'.
>>
>> There is now an "idealized effing theory" world described in the Macro
>> Material Property Dualism camp: http://canonizer.com/topic.asp/88/36 .
>>
>
> I would fall in the camp of Functional Property Dualism, I think. Unless
> there is some other camp that is even more explicit.
>
>> In that theoretically possible world, it is the neurotransmitter
>> glutamate that has the element redness quality.  In this theoretical world
>> Glutamate causally behaves the way it does, because of its redness
>> quality.  Yet this causal behavior reflects 'white' light, and this is why
>> we think of it as having a 'whiteness' quality.  But of course, that is
>> the classic example of the quale interpretation problem (see:
>> http://canonizer.com/topic.asp/88/28 ).  If we interpret the causal
>> properties of something with a redness quality to it, and represent our
>> knowledge of such with something that is qualitatively very different, we
>> are missing and blind to what is important about the qualitative nature of
>> glutamate, and why it behaves the way it does.
>>
>
> I don't believe that. I believe "redness" is an emergent illusion
> constructed by the brain in software, and has as much to do with the
> glutamate as Word has to do with Accumulators and Assembly.
>
>
>> So, let's just forget about the redness quality for a bit, and just talk
>> about the real fundamental causal properties of glutamate in this
>> theoretical idealized effing world.  In this world, the brain is
>> essentially a high fidelity detector of real glutamate.  The only time the
>> brain will say: "Yes, that is my redness quality" is when real glutamate,
>> with its real causal properties, is detected.  Nothing else will produce
>> that answer, except real fundamental glutamate.
>>
>
> I totally disagree, but I do understand your position better.
>
>
>> Of course, as described in Chalmers' paper, you can also replace the
>> system that is detecting the real glutamate, with an abstracted system that
>> has appropriate hardware translation levels for everything that is being
>> interpreted as being real causal properties of real glutamate, so once you
>> do this, this system, no matter what hardware it is running on, can be
>> thought of, or interpreted as acting like it is detecting real glutamate.
>>
>
> Correct. And that's as "real" as the other in my mind.
>
>
>> But, of course, that is precisely the problem, and how this idea is
>> completely missing what is important.  And this theory is falsifiably
>> predicting the alternate possibility he describes in that paper.  It is
>> predicting you'll have some type of 'fading quale', at least until you
>> replace all of what is required, to interpret something very different than
>> real consciousness, as consciousness.
>>
>
> If it walks like a duck, and quacks like a duck, who is to say that it is
> not a duck?
>
>
>> It is certainly theoretically possible that the real causal properties
>> of glutamate are behaving the way they do, because of its redness
>> quality.  And that anything else that is being interpreted as the same, can
>> be interpreted as such - but that's all it will be.  An interpretation of
>> something that is fundamentally, and possibly qualitatively, very different
>> than real glutamate.
>>
>
> As Spike says, if it is qualitatively different, but still delivers me a
> Big Mac when I order it, I'm good with that for many purposes.
>
>
>>
>> This one theoretical possibility, thereby, proves Chalmers' idea isn't a
>> proof that abstracted computers have these phenomenal qualities, only that
>> they can be thought of, or interpreted as having them.
>>
>
> I'm of the camp that will believe something has consciousness when it says
> it does, and when it acts in every way as if it does. Good enough for me. I
> guess that puts me in a different cannon?
>
> -Kelly
>
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
>