[ExI] A new theory of consciousness: conditionalism

William Flynn Wallace foozler83 at gmail.com
Sat Aug 26 17:20:50 UTC 2023


I don't fully understand 'range of gradations of consciousness'.  What,
for example, is 'partly conscious'?  Or what is the difference between an amoeba's
consciousness and ours?  billw

On Sat, Aug 26, 2023 at 11:08 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Sat, Aug 26, 2023, 10:38 AM William Flynn Wallace via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Hard enough to define consciousness.  How about knowledge?  Where does it
>> start?  The body knows thousands of things to do - digesting food,
>> responding to pain, reflexes like kneejerks, various emotions - all of
>> these built in.
>>
>
>
> One implication of this is that there may be many independent minds
> operating within our bodies and brains. Within a reflex, for example,
> neurons react conditionally to a stimulus. The consciousness of such a
> reflex would be very simple, however, rather like that of a thermostat.
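>
> As a rough illustration, here is a minimal sketch in Python (the function
> name, setpoint, and return values are purely illustrative, not any real
> device's API):
>
>     def thermostat_step(sensed_temp_c, setpoint_c=20.0):
>         # Conditional: the system reaches its "too cold" knowledge state
>         # if and only if the sensed fact is true.
>         too_cold = sensed_temp_c < setpoint_c
>         # An action is linked to that state of knowledge.
>         return "heater_on" if too_cold else "heater_off"
>
>     print(thermostat_step(18.5))  # -> heater_on
>
> The thermostat's internal state is thereby correlated with a truth about
> the world: in this minimal sense it "knows" whether the room is cold.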
>
> I think something like this is necessary, as I will elaborate below.
>
>
> Then you have CRs (conditioned reflexes), like staying away from a hot stove
>> on which you have burned yourself.
>>
>> Then reinforcement type knowledge - what to do to gain
>> positive reinforcers and avoid punishments.  Verbal knowledge.  Motor
>> knowledge.  Etc.
>>
>> All animals, down to the amoeba, possess reflexes. A bit up from that are
>> conditioned reflexes.
>>
>> So - just how are you using the term 'knowledge' in your discussion of
>> consciousness?  If knowledge equals consciousness, then by some definitions
>> the amoeba is conscious.  bill w
>>
>
> I think consciousness arose early in the history of life. It may have begun
> in its simplest form with bacteria that react to light or touch. I think
> a range of gradations of consciousness is necessary in the phylogenetic
> tree, as otherwise we face the prospect of unconscious "zombie" parents
> giving birth to a fully conscious, self-aware child.
>
> Jason
>
>
>
>>
>> On Sat, Aug 26, 2023 at 9:28 AM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> Thank you, John, for your thoughts. A few notes below:
>>>
>>> On Sat, Aug 26, 2023 at 7:17 AM John Clark <johnkclark at gmail.com> wrote:
>>>
>>>> On Fri, Aug 25, 2023 at 1:47 PM Jason Resch <jasonresch at gmail.com>
>>>> wrote:
>>>>
>>>> *> At a high level, states of consciousness are states of knowledge,*
>>>>>
>>>>
>>>> That is certainly true, but what about the reverse: does a high
>>>> state of knowledge imply consciousness?  I'll never be able to prove it, but
>>>> I believe it does. Of course, for this idea to be practical there must be
>>>> some way of demonstrating that the thing in question does indeed have
>>>> a high state of knowledge, and the test for that is the Turing Test. The
>>>> fact that my fellow human beings have passed the Turing Test is the
>>>> only reason I believe that I am NOT the only conscious being in the
>>>> universe.
>>>>
>>>
>>> Yes, I believe there's an identity between states of knowledge and
>>> states of consciousness. That is almost implicit in the definition of
>>> consciousness:
>>> con- means "with"
>>> -scious- means "knowledge"
>>> -ness means "the state of being"
>>> con-scious-ness -> the state of being with knowledge.
>>>
>>> Then, the question becomes: what is a state of knowledge? How do we
>>> implement or instantiate a knowledge state, physically or otherwise?
>>>
>>> My intuition is that it requires a process of differentiation, such that
>>> some truth becomes entangled with the system's existence.
>>>
>>>
>>>>
>>>> *> A conditional is a means by which a system can enter/reach a state
>>>>> of knowledge (i.e. a state of consciousness) if and only if some fact is
>>>>> true.*
>>>>>
>>>>
>>>> Then "conditional" is not a useful philosophical term because you could
>>>> be conscious of and know a lot about Greek mythology. but none of it is
>>>> true except for the fact that Greek mythology is about Greek mythology.
>>>>
>>>
>>> Yes. Here the truth doesn't have to be some objective truth; it can be
>>> the truth of what causes one's mind to reach a particular state. E.g., here it
>>> would be the truth of which particular sensory data came into the scholar's
>>> eyes as he read a book of Greek mythology.
>>>
>>>
>>>
>>>> >  *Consciousness is revealed as an immaterial, ephemeral relation,
>>>>> not any particular physical thing we can point at or hold.*
>>>>>
>>>>
>>>> I mostly agree with that, but that doesn't imply there's anything
>>>> mystical going on; information is also immaterial, and you can't point to *ANY
>>>> PARTICULAR* physical thing
>>>>
>>>
>>> I agree.
>>>
>>>  (although you can always point to *SOME* physical thing) and I believe
>>>> it's a brute fact that consciousness is the way information feels when it
>>>> is being processed intelligently.
>>>>
>>>
>>> I like this analogy, but I think it is incomplete. Can information (by
>>> itself) feel? Can information (by itself) have meaning?
>>>
>>> I see value in making a distinction between information and "the system
>>> to be informed." I think the pair are necessary for there to be meaning, or
>>> consciousness.
>>>
>>>
>>> However, there is nothing ephemeral about information. As far as we can
>>>> tell, the laws of physics are unitary, that is, information can't be
>>>> destroyed and the probabilities of all possible outcomes must add up to 100%.
>>>> For a while Stephen Hawking thought that black holes destroyed information,
>>>> but he later changed his mind; Kip Thorne still thinks they may do so, but he
>>>> is in the minority.
>>>>
>>>
>>> I agree information can't be destroyed. But note that what I called
>>> ephemeral was the conditional relation, which (at least usually) seems to
>>> occur and last for only a short time.
>>>
>>>
>>>
>>>>
>>>> *> All we need to do is link some action to a state of knowledge.*
>>>>>
>>>>
>>>> At the most fundamental level that pretty much defines what a computer
>>>> programmer does to make a living.
>>>>
>>>
>>> Yes.
>>>
>>>
>>>
>>>> * > It shows the close relationship between consciousness and
>>>>> information, where information is defined as "a difference that makes a
>>>>> difference",*
>>>>>
>>>>
>>>> And the smallest difference that still makes a difference is the
>>>> difference between one and zero, or on and off.
>>>>
>>>
>>> The bit is the simplest unit of information, but interestingly, there
>>> can also be fractional bits. For example, if there's a 75% chance of some
>>> event, like two coin tosses not both being heads, and I tell you that two
>>> coin tosses were not both heads, then I have only
>>> communicated -log2(0.75) ~= 0.415 bits of information to you.
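>>>
>>> To make that concrete, a small sketch in Python using only the standard
>>> math module (the probabilities are just the ones from the coin example
>>> above):
>>>
>>>     import math
>>>
>>>     p = 0.75                    # P(two fair coin tosses are not both heads)
>>>     info_bits = -math.log2(p)   # self-information of learning this occurred
>>>     print(round(info_bits, 3))  # -> 0.415
>>>
>>> A less likely message carries more information: learning that both tosses
>>> WERE heads (p = 0.25) would convey -log2(0.25) = 2 bits.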
>>>
>>>
>>>
>>>> > *It shows a close relationship between consciousness and
>>>>> computationalism,*
>>>>>
>>>>
>>>> I strongly agree with that; it makes no difference if the thing doing
>>>> that computation is carbon-based and wet and squishy, or silicon-based and
>>>> dry and hard.
>>>>
>>>
>>> Absolutely!
>>>
>>>
>>>>  >  It is also supportive of functionalism and its multiple
>>>>> realizability, as there are many possible physical arrangements that lead
>>>>> to conditionals.
>>>>
>>>>
>>>> YES!
>>>>
>>>> *> It's clear that neural network firings are all about conditionals:
>>>>> whether or not a neuron will fire depends on which other neurons have
>>>>> fired, and combining these conditions binds up many conditional relations
>>>>> into one larger one. It seems no intelligent (reactive, deliberative,
>>>>> contemplative, reflective, etc.) process can be made that does not contain
>>>>> at least some conditionals, as without them there can be no responsiveness.
>>>>> This explains the biological necessity to evolve conditionals and apply
>>>>> them in the guidance of behavior. In other words, consciousness (states of
>>>>> knowledge) would be strictly necessary for intelligence to evolve.*
>>>>>
>>>>
>>>> I agree with all of that.
>>>>
>>>
>>> Happy to hear that. Thanks for all your feedback.
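>>>
>>> To make the point above about neural firings concrete, here is a minimal
>>> sketch in Python (the weights and threshold are made up for illustration)
>>> of a threshold unit that fires conditionally on which of its inputs have
>>> fired, binding several simple conditionals into one larger one:
>>>
>>>     def neuron_fires(inputs, weights, threshold):
>>>         # Fire iff the weighted sum of upstream firings crosses the threshold.
>>>         return sum(w * x for w, x in zip(weights, inputs)) >= threshold
>>>
>>>     # Three upstream neurons; the unit's output reflects their joint state.
>>>     print(neuron_fires([1, 0, 1], weights=[0.6, 0.2, 0.5], threshold=1.0))
>>>     # -> True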
>>>
>>> Jason
>>>
>>>
>>>  John K Clark    See what's on my new list at  Extropolis
>>>> <https://groups.google.com/g/extropolis>
>>>>
>>>>
>>>>
>>>>
>>>>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>

