[ExI] Bringing new life to dead matter

The Avantguardian avantguardian2020 at yahoo.com
Wed Jun 6 12:15:11 UTC 2012


----- Original Message -----
> From: Kelly Anderson <kellycoinguy at gmail.com>
> To: The Avantguardian <avantguardian2020 at yahoo.com>; ExI chat list <extropy-chat at lists.extropy.org>
> Cc: 
> Sent: Tuesday, June 5, 2012 11:30 PM
> Subject: Re: [ExI] Bringing new life to dead matter
> 
> On Mon, Jun 4, 2012 at 5:34 PM, The Avantguardian
> <avantguardian2020 at yahoo.com> wrote:
>> ----- Original Message -----
>>> From: Kelly Anderson <kellycoinguy at gmail.com>
>>> To: ExI chat list <extropy-chat at lists.extropy.org>
>>> Cc:
>>> Sent: Sunday, June 3, 2012 4:28 AM
>>> Subject: Re: [ExI] Bringing new life to dead matter

>>>  All chemicals work
>>> with information, but informational processing doesn't equal
>>> consciousness. Most people don't consider computers conscious (yet)
>>> but they process way more information than a plant.
>> 
>> Last I heard human brains process information electrochemically with 
> chemical neurotransmitters. Plants are a network of living cells. Nervous 
> systems are a network of living cells. The key difference could simply be 
> connectivity. Neurons are each connected to more other neurons than plant cells 
> which are simply connected to the neighboring cells they touch.
>> 
> 
> And speed. Speed is always part of the measure of intelligence.

Granted. And I would agree that nervous systems should be faster and more intelligent than plants. But at least we seem to agree that plant intelligence and animal intelligence might lie along the same continuum.

>> One possible way to define a "unit" of consciousness is through 
> an environmental sensory-behavioral feedback loop. For example, a simple 
> thermostat might be considered conscious in a rudimentary way: it senses the 
> temperature of its environment and then turns the HVAC system on or off 
> accordingly. Computers sense and respond to their users through a keyboard; 
> if one gives them other senses, they can be programmed to respond to 
> other stimuli. So I guess what I am asking is: if consciousness is *not* 
> information processing, then what else might it be?
> 
> OK... maybe I'm defining consciousness more in terms of "self
> awareness"... I don't know how self awareness is related to
> consciousness, but when I think of consciousness, that's sort of the
> kind of thing that pops into my head. And no, I do not think that a
> switch is self aware. Plants? I dunno.

Ahh, OK. Now that I understand that your definition of consciousness includes self-awareness, we might get somewhere interesting. Yes, I agree that a single sensory-behavioral feedback loop is most likely not self-aware. It is, however, aware of the single environmental parameter that it "senses"; in the case of a thermostat, that would be temperature. Now imagine a very large ensemble of individual S-B feedback loops that each measure and regulate different but very specific environmental parameters: luminosity, specific wavelengths of light, temperature, pressure, salinity, pH, glucose concentration, concentration of nucleotides, and so on. Let us agree to call the generalization of these individual feedback loops "homeostats", since the word thermostat specifically refers to temperature.
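Just to make the generalization concrete, here is a minimal sketch of one such homeostat as code. All the names (Homeostat, step, the "increase"/"decrease"/"idle" actions) are my own illustrative inventions, not anything from an actual control system:

```python
class Homeostat:
    """One sensory-behavioral feedback loop: senses a single
    environmental parameter and acts to keep it near a setpoint."""

    def __init__(self, parameter, setpoint, tolerance):
        self.parameter = parameter  # e.g. "temperature", "pH", "glucose"
        self.setpoint = setpoint
        self.tolerance = tolerance

    def step(self, sensed_value):
        """One pass through the sense-compare-act loop."""
        error = sensed_value - self.setpoint
        if error > self.tolerance:
            return "decrease"   # e.g. a thermostat switching on cooling
        if error < -self.tolerance:
            return "increase"   # e.g. a thermostat switching on heating
        return "idle"           # within tolerance: do nothing


thermostat = Homeostat("temperature", setpoint=21.0, tolerance=0.5)
print(thermostat.step(23.0))  # decrease
print(thermostat.step(20.0))  # increase
print(thermostat.step(21.2))  # idle
```

The loop is only "aware" of the one variable it measures, which is the point: everything richer has to come from stacking many of these together.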

Now increase the number of parameters with S-B feedback loops that measure and respond to them until they number in the thousands. The multivariate homeostat has now become something akin to a living cell in complexity and intelligence. As those feedback loops become still more numerous and interconnected, the homeostat becomes aware of higher-order environmental parameters that emerge out of the complexity of its sensory inputs: movement, color, sounds, smells, flavors, textures, hunger, pain, pleasure, moistness, etc. Now the multivariate homeostat is on par with an animal like a mouse. Increase the number and complexity of S-B feedback loops still further, until the multivariate homeostat assesses and responds to the complex behavior of other more or less similar homeostats, and you have a homeostat that amounts to a social animal like a primate.
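The scaling-up step can be sketched the same way: a multivariate homeostat is just an ensemble of independent per-parameter loops run in parallel. Again, this is a toy illustration under my own assumed names and numbers, not a model of any real cell:

```python
class MultivariateHomeostat:
    """An ensemble of S-B feedback loops, one per environmental parameter."""

    def __init__(self, setpoints, tolerance=0.01):
        # setpoints: {parameter_name: target_value}
        self.setpoints = setpoints
        self.tolerance = tolerance  # fractional deadband around each target

    def step(self, readings):
        """Run every feedback loop once; return the action each one takes."""
        actions = {}
        for name, target in self.setpoints.items():
            value = readings.get(name, target)  # unsensed parameters stay idle
            error = value - target
            if abs(error) <= self.tolerance * abs(target or 1):
                actions[name] = "idle"
            else:
                actions[name] = "decrease" if error > 0 else "increase"
        return actions


# A cell-like homeostat regulating three (of thousands of) parameters:
cell = MultivariateHomeostat({"pH": 7.4, "glucose": 5.0, "temperature": 37.0})
print(cell.step({"pH": 7.1, "glucose": 5.0, "temperature": 39.0}))
# {'pH': 'increase', 'glucose': 'idle', 'temperature': 'decrease'}
```

Each loop here is still blind to every other loop; the "higher-order" awareness in the paragraph above would require loops whose inputs are the outputs of other loops, which this flat sketch deliberately leaves out.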
 
The big question is how and where the phenomenon of self-awareness emerges. Perhaps when all possible environmental parameters are exhaustively accounted for, the self is all that remains unaccounted for. Perhaps self-awareness is defined by its own incomprehensibility: a blind spot in our model of the world that, by simple process of elimination, *must* be ourselves. Perhaps consciousness is defined by what it is not:
 
NOT("you" OR "environment") = "me"  
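The elimination argument can be rendered as set complement, if we pretend for a moment that a world-model is just a bag of labeled percepts (purely a toy, with made-up labels):

```python
# Everything the model represents:
world_model = {"you", "tree", "sky", "me"}

# Percepts the model can attribute to something external:
environment = {"tree", "sky"}
others = {"you"}

# Whatever cannot be assigned to others or environment is, by
# elimination, the self: NOT("you" OR "environment") = "me"
self_by_elimination = world_model - (others | environment)
print(self_by_elimination)  # {'me'}
```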


Stuart LaForge


"Man is a strange animal, he doesn't like to read the handwriting on the wall until his back is up against it."  -Adlai Stevenson




