[ExI] Is Artificial Life Conscious?

Brent Allsop brent.allsop at gmail.com
Sun Apr 24 22:36:49 UTC 2022


Hi Jason,

Yes, we've got to include the ExI list, for Stathis' (and others'?)
sake.  He is in the more popular Functionalist
<https://canonizer.com/topic/88-Theories-of-Consciousness/8-Functional-Prprty-Dualism>
camp,
which you, like so many, appear to agree with.

The Molecular Materialism
<https://canonizer.com/topic/88-Theories-of-Consciousness/36-Molecular-Materialism>
camp
just considers "philosophical zombies" to be absurd, especially since they
are normally defined in a way where different qualia may or may not
'supervene' on top of identical physical reality.  This is just absurd and
is not scientifically falsifiable.  I prefer falsifiable
theoretical science to unfalsifiable philosophy.  Also, as we point out in
our video, functionalists are no better than dualists, as they
separate qualia from physical reality.  Despite how many times I've asked
Stathis for a way to falsify his theory, he has yet to describe how
functionalism may be falsified.  So to me, it is no better than dualism.
All he seems to do is claim that qualia, like redness and greenness, aren't
possible, because they, themselves, are the substrate on which consciousness
is composed.

And of course I've considered "tetrachromats," which have 4 primary colors,
and shrimp that must have a lot more than that.  As I always say, I pity the
bichromats (2 primary colors, or color-blind people), or even worse the
achromats (black and white only), and can't wait till I (a mere trichromat)
discover not only what it is like for a tetrachromat, but what it is like
for all those 16-primary-color shrimp.  A brain like that is what I want to
be uploaded to, and how many more physical colors could be discovered after
that?  Even if we discover hundreds, with many thousands of shades of
each, that is still a long way from infinite.
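As a rough back-of-envelope sketch of how fast the possibilities grow with cone count (assuming, purely for illustration, that each cone channel resolves about 100 intensity levels and that channels combine independently -- both hypothetical simplifications, not claims about real visual systems):

```python
# Illustrative estimate of how the number of distinguishable colors
# could scale with the number of cone types.  LEVELS_PER_CONE is a
# hypothetical figure; real discrimination is far more complicated.

LEVELS_PER_CONE = 100  # assumed discriminable levels per channel

def distinguishable_colors(num_cone_types: int) -> int:
    """Upper-bound estimate: levels raised to the number of cone types."""
    return LEVELS_PER_CONE ** num_cone_types

for name, cones in [("bichromat", 2), ("trichromat", 3),
                    ("tetrachromat", 4), ("mantis shrimp", 16)]:
    print(f"{name}: ~{distinguishable_colors(cones):.1e} colors")
```

Under these toy assumptions a trichromat lands near the commonly cited ~1 million distinguishable colors, and each added cone type multiplies the total, so "hundreds of primary colors" would still be nowhere near infinite, exactly as the paragraph above says.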

And you still seem to be missing something when you say: "in any conscious
state one finds oneself [in], one can only ever know... that one state,"
and when you talk about computational binding, like: "less-than or
greater-than comparison operations, equality tests".  These kinds of
comparisons are always done between specific things or facts of the
matter.  That's what computation about objects is.  Your one composite
qualitative experience of the strawberry includes both redness and greenness.

[image: 3_robots_tiny.png]
While it is true that all 3 of these different systems can function the
same, in that they can all distinguish between, and tell you, whether the
strawberry is red or not, that misses the point: the factual qualitative
differences between each of them, the physical qualities they are using to
represent those differences, or the fact that the third one's knowledge is
intentionally abstracted away from whatever physical qualities may be
representing it, in a way that requires a dictionary.  You can't get
substrate independence without a dictionary for each different
representation that may or may not be representing the ones and zeros.
Representing knowledge the way the first two do does not require a
dictionary, which is far more efficient than the third, which does require
an additional dictionary.  It's the same way software runs faster directly
on physical hardware than on a virtual machine (which requires a functional
mapping dictionary to the differently functioning hardware).
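The dictionary point can be sketched in a few lines (the bit strings and the lookup table here are hypothetical, invented just to show the extra interpretation step the abstract system needs):

```python
# Two ways of answering "is the strawberry red?".
# System A's representation is used directly: no translation needed.
# System C stores an abstract bit string, so it needs a dictionary
# mapping bit strings to meanings before its state means anything.

# System A: the representation itself does the work.
system_a_state = "red"

def system_a_is_red() -> bool:
    return system_a_state == "red"  # direct comparison, no lookup

# System C: abstract bits plus a required dictionary.
system_c_state = "110010"                            # hypothetical bits for 'red'
dictionary = {"110010": "red", "001101": "green"}    # the required mapping

def system_c_is_red() -> bool:
    return dictionary[system_c_state] == "red"       # one extra lookup step

print(system_a_is_red(), system_c_is_red())  # both functionally answer True
```

Both systems give the same functional answer, but system C's answer passes through an interpretation table; swap in a different dictionary and the very same bit string would "mean" something else, which is the sense in which the abstract representation is untethered from any particular physical quality.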

Which brings me to the 3rd-strongest form of effing the ineffable, which
was portrayed in the movie Avatar <https://youtu.be/Uf9SWvs4beE?t=12> with
neural ponytails.  These could function like the corpus callosum, which can
computationally bind knowledge represented in the left hemisphere with
knowledge represented in the right.  With a neural ponytail like that, you
would experience all of the experience, not just half.  If the first two
systems in the above image (where one's redness is like your greenness)
were connected this way, they would directly experience this difference,
just the same as if your left field of vision was looking through red/green
inverted glasses.  It is called "4. the strongest form" of effing the
ineffable, because what you directly apprehend is infallible and cannot be
doubted, the same way "I think, therefore I am" cannot be doubted.
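The red/green signal inverter separating the first two systems can be sketched as a trivial channel swap (the RGB tuples are just illustrative stand-ins for whatever the optic nerve actually carries):

```python
# A red/green signal inverter: swaps the R and G channels of an RGB
# triple.  The two systems end up with different raw representations
# of the same strawberry, yet each still reports "red" relative to
# its own learned representation -- functionally identical behavior.

def invert_red_green(rgb):
    """Swap the red and green channels, leaving blue untouched."""
    r, g, b = rgb
    return (g, r, b)

signal_from_strawberry = (255, 0, 0)              # first system's representation
inverted_signal = invert_red_green(signal_from_strawberry)  # second system's

print(signal_from_strawberry, inverted_signal)    # differing representations

# Applying the inverter twice restores the original signal, which is
# why a ponytail-style binding between the two systems would expose
# the representational difference that behavior alone never reveals.
assert invert_red_green(inverted_signal) == signal_from_strawberry
```

This is only a toy model of the inverted-glasses scenario, but it makes the claim concrete: the functional input/output mapping is preserved while the representation carrying it differs.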

Brent

On Sun, Apr 24, 2022 at 3:05 PM Jason Resch <jasonresch at gmail.com> wrote:

>
>
> On Sun, Apr 24, 2022, 4:17 PM Brent Allsop <brent.allsop at gmail.com> wrote:
>
>>
>>
>> Hi Jason, This is GREAT!  You clearly understand a LOT about
>> consciousness, but there are two minor things I believe you are missing.
>>
>
> Thank you for saying so and for helping clear up any gaps in my
> understanding.
>
>
>
>>
>> *First off*, let’s distinguish between an elemental quale(singular) and
>> computationally bound composite qualia(plural).  There is LOTS of other
>> memory stuff bound in with that elemental redness you experience when you
>> look at that red strawberry.
>>
>
> I agree with this. There were some experiments done recently that found
> the speed at which one could pick out different colors was related to color
> names that exist in one's native language. It got me wondering whether the
> quale of colors might be shaded by labels from one's language centers of
> the brain.
>
>   All of this, together is a composite, computationally bound conscious
>> experience.  And sure, there are an infinite number of different possible
>> composite experiences, just as there are an infinite number of paintings
>> which can be painted with a finite set of elemental colors.
>>
>
> True but I think it goes beyond this. The raw qualia of consciousness are,
> I think, as varied as the objects and relationships that exist in
> mathematics.
>
>
>
>> [image: 3_robots_tiny.png]
>>
>>
>>
>> The prediction is that there is an elemental quale level out of which all
>> composite qualia are composed.  Just like it is a fact that there is a
>> finite set of physical elements (at least that we currently know of) there
>> is a finite set of elemental qualia that all humans have experienced to
>> date.
>>
>
> Have you considered human tetrachromats? People with four types of color
> sensing cones that can distinguish 100s of times more colors than normally
> sighted humans?
>
> Also, there are relational differences between the colors. In that black
> is a singular shade while white comes in various degrees of brightness.
> Blue can be seen as the subtraction of yellow from white, while red can be
> seen as the subtraction of green from yellow. So at their basic level,
> there is no symmetry between the primary colors; they each have a uniquely
> defined relationship in the three-dimensional color space.
>
>
> (Obviously a falsifiable prediction, but until it is falsified.... we
>> don't yet need more primary colors).
>>
>
> There are some shrimp, I believe, that have 16 different color-sensing
> cones. Does that not mean that in theory their brain could construct a
> 16-dimensional color space with 16 different primary colors?
>
>
> The quality you experience when you look at the strawberry in this picture
>> is just an objectively observable physical fact.
>>
>
> I would challenge the assertion that it is a physical fact, on the basis
> that the number of computational states a brain or computer could
> instantiate is untethered to the physics of this universe. Any universe in
> which it is possible to build a computer can instantiate the computational
> relations that may be found in any brain or computer of any other universe.
> This is a direct consequence of the "universality" property of computers.
> What any computer may compute is independent of its substrate, material
> makeup, or architecture.
>
>   The prediction is that there is a finite set of these elemental
>> intrinsic qualities, like redness and greenness, out of which brains can
>> compose an infinite number of conscious experiences.
>>
>
>
> This is an interesting idea. For what it's worth I think it could be true,
> but I would argue that the "atoms" of consciousness exist at a much lower
> level than red or green (which seem to require vast resources of billions
> of neurons to construct our visual field). Instead I think the raw elements
> of consciousness, if they exist, would be at more fundamental levels:
>
> if-statements, less-than or greater-than comparison operations, equality
> tests, truth tables, etc. I think these operations are the most primitive
> forms of reacting to or responding to information, and a more complex quale
> is built up of some combination of these relations that can create
> arbitrarily complex and large state-spaces.
>
>
>>
>> *Second*, you made a falsifiable claim:  “we are only ever privileged to
>> know "what it is like" in a single particular view at a single particular
>> instant.”
>>
>>
>>
>> Then you backtracked with this:
>>
>>
>>
>> “If certain functions or behaviors require a first person
>> experience/experiencer, then this may provide a method by which we could
>> study or at least detect consciousness in others.”
>>
>
> I don't consider that a backtrack. My first point is only meant to say: in
> any conscious state one finds oneself, one can only ever know and that one
> state. Even a consideration of a memory is just a particular case of a
> single conscious state existing at one moment in time.
>
> My second point is meant to suggest my belief that philosophical zombies
> are impossible. Though false appearances of "non conscious actions that
> appear consciously driven" can exist for arbitrarily long lengths of time,
> they become exponentially unlikely to continue with increasing time.
>
> The impossibility of zombies has consequences for many theories of mind,
> including molecular materialism.
>
> Does molecular materialism predict philosophical zombies are possible?
>
>
>
>>
>> I’m in the Molecular Materialism
>> <https://canonizer.com/topic/88-Theories-of-Consciousness/36-Molecular-Materialism>
>> camp which is predicting you are not quite backtracking in the right way.
>> As I said, above, the quality your brain uses to represent red things with
>> is just a physical fact. It is only a matter of time till we objectively
>> discover exactly what your redness is.  Once we objectively know that,
>> (that will give us the required dictionary for the term "your redness") we
>> will then be able to objectively observe whether this is the same
>> definition for "my redness".  And this is only the "1. Weakest form of
>> effing the ineffable."  There are also the "2. Stronger forms" and "3.
>> Strongest forms" of effing the ineffable.
>>
>>
>> For more information about the difference between “*perceiving*”
>> physical facts vs “*directly apprehending* intrinsic qualities of
>> knowledge”, check out the “Differentiating between reality and knowledge
>> of reality
>> <https://canonizer.com/videos/consciousness/?chapter=differentiate_reality_knowledge>”
>> chapter in our video.
>>
>>
>>
>
> Thank you. I agree there's a difference between reality and knowledge of
> reality, though by definition the only parts of reality that can be known
> are the knowledge of reality (in other words, those that emerge as states
> of consciousness).
>
> Perhaps this is the basis of the Hindu belief that Brahman (all of
> reality) is Atman (all of consciousness).
>
>
> P.S. I noticed in my reply I accidentally dropped the extropy chat list.
> Feel free to copy that list in your reply, or we can keep this one to one,
> whatever your preference may be.
>
> Jason
>
>
>
>> On Sun, Apr 24, 2022 at 10:00 AM Jason Resch <jasonresch at gmail.com>
>> wrote:
>>
>>>
>>>
>>> On Sun, Apr 24, 2022, 4:09 AM Brent Allsop <brent.allsop at gmail.com>
>>> wrote:
>>>
>>>>
>>>> Hi Jason,
>>>>
>>>> You had a long list of attributes in one of your replies to that list.
>>>>
>>>> *Consciousness is:*
>>>>
>>>>    - Awareness of Information
>>>>    - A knowledge State
>>>>    - An Infinite Class (infinite possible variations and permutations,
>>>>    configurations)
>>>>    - A requirement for: Experience, Thought, Feeling, Knowing, Seeing,
>>>>    Noticing (can any of these things exist absent consciousness? E.g. some
>>>>    part of system that acts like it knows must really know.)
>>>>    - An activity (not a passive state of 0s and 1s,
>>>>    operations/behavior/actions give meaning and context to information and how
>>>>    it is processed and what it means)
>>>>    - Is it a recursive relationship? A model of environment including
>>>>    self?
>>>>    - Is it undefinable?
>>>>    - Word origin: "con" (together/with/unified/united) "scious"
>>>>    (knowledge): unified knowledge
>>>>    - It exists in the abstract informational state, not in the material
>>>>    - A meaningful interpretation of information
>>>>
>>>> *Information is:*
>>>>
>>>>    - A difference that makes a difference
>>>>    - A comparison, differentiation, distinction
>>>>    - Specification / Indication
>>>>    - Negative entropy
>>>>    - A decrease in uncertainty
>>>>    - A probability of being in different states
>>>>    - Bits, digits, a number (representations of information)
>>>>    - A subspace of a larger space
>>>>    - A state of a finite state machine
>>>>    - Requires an interpreter (A system to be informed) to be meaningful
>>>>
>>>> *A subject is:*
>>>>
>>>>    - A system to be informed
>>>>    - A processor of information
>>>>    - A knower (a believer)
>>>>    - An inside viewer
>>>>    - A first-person
>>>>    - A possessor of knowledge
>>>>    - An interpreter of information
>>>>    - A modeler of environment or self (or both)
>>>>
>>>> *Knowledge is:*
>>>>
>>>>    - An apprehended truth
>>>>    - A true belief (bet)
>>>>    - Not always shareable (when self-referential)
>>>>    - A relationship between two objects or object and itself
>>>>
>>>>
>>>> [image: 3_robots_tiny.png]
>>>> In addition to all 3 of these systems being able to tell you something
>>>> is red, each could also be engineered to do everything in your list.
>>>>
>>>> Consciousness is less about functionality than it is about "what it is
>>>> like".
>>>>
>>>
>>> I can agree with this statement. However we are only ever privileged to
>>> know "what it is like" in a single particular view at a single particular
>>> instant.
>>>
>>> When studying the plausible existence of this phenomenon in others, we
>>> are limited to analyzing third person observable properties, such as
>>> material composition, information content, and activities such as function
>>> and behavior.
>>>
>>> If certain functions or behaviors require a first person
>>> experience/experiencer, then this may provide a method by which we could
>>> study or at least detect consciousness in others.
>>>
>>>
>>>> A redness quality is not an intrinsic quality of the strawberry, it is
>>>> a quality of your conscious knowledge of a strawberry.
>>>>
>>>
>>> I agree redness is a feature that exists only in some conscious minds.
>>>
>>> I believe there are infinite ways a mind can be organized and therefore
>>> an infinite number of possible qualia. There's no fundamental limit on the
>>> number of colors that could exist and be perceived, even the number of
>>> primary colors is in theory infinite. What any of them "look like" comes
>>> down entirely to the mind in question.
>>>
>>> The only difference between the first 2 is a red green signal inverter
>>>> in its optic nerve, changing the definition of redness for that one.  You
>>>> need a dictionary to know what the word red means.  The intrinsic colorness
>>>> quality your brains represents knowledge of red things with is your
>>>> definition of the word red.
>>>>
>>>
>>> Then there's really no common meaning of the word red, given the
>>> different brains involved, just as there's no common meaning of the taste
>>> of cilantro (which tastes like an herb to some people and like soap to
>>> other people).
>>>
>>>
>>>> All 3 could also be minimally engineered so that all they can do is
>>>> tell you the strawberry is red, nothing more, not even most of the stuff
>>>> in your list.  But I would still classify the first two as conscious, since both of
>>>> their conscious knowledge is like something.  The first one's redness is
>>>> like your redness, the second one's is like your greenness, and the third
>>>> one represents knowledge with strings of ones and zeros, all of which you
>>>> need a dictionary to first know what the hardware representing each one or
>>>> zero is, and after that, again, to know what the word 'red', made up of
>>>> strings of those ones and zeros represents.
>>>>
>>>
>>> Though perhaps differently conscious, would you still say the third
>>> one is conscious? Is it like something to be able to discriminate between
>>> two or more possibilities and know which one it is? Even if that knowledge
>>> is of a single bit, is that not some small element of conscious awareness,
>>> for which there is something it is like to be?
>>>
>>> Jason
>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Sat, Apr 23, 2022 at 11:55 PM Jason Resch via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>
>>>>> I recently posted this question to the everything list
>>>>> <https://groups.google.com/g/everything-list/c/Ga8KWzjM_dk>, but I
>>>>> know many here are also deeply interested in the topic of consciousness so,
>>>>> I thought I should post it to this group too. My question was:
>>>>>
>>>>> These "artificial life" forms, (seen here
>>>>> <https://www.youtube.com/playlist?list=PLq_mdJjNRPT11IF4NFyLcIWJ1C0Z3hTAX>),
>>>>> have neural networks that evolved through natural selection, can adapt to a
>>>>> changing environment, and can learn to distinguish between "food" and
>>>>> "poison" in their environment.
>>>>>
>>>>> If simple creatures like worms or insects are conscious, (because they
>>>>> have brains, and evolved), then wouldn't these artificial life forms be
>>>>> conscious for the same reasons?
>>>>>
>>>>> Why or why not?
>>>>>
>>>>> Jason
>>>>> _______________________________________________
>>>>> extropy-chat mailing list
>>>>> extropy-chat at lists.extropy.org
>>>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>>>
>>>>