[ExI] Is Artificial Life Conscious?

Brent Allsop brent.allsop at gmail.com
Tue May 10 14:00:10 UTC 2022


Yes, that is all true, but it is still missing the point.
There must be something in the system which has a colorness quality.
You must be able to change redness to greenness, and if you do, the system
must be able to report that the quality has changed.
If that functionality is not included somewhere in the system, it does not
have sufficient functionality to be considered conscious.
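The two functional claims in this thread can be sketched as toy code (everything below is illustrative; the class, the "quality token" strings, and the method names are my assumptions, not anyone's actual model of qualia): first, that the same function (sorting) is multiply realizable by different algorithms; second, the requirement above that swapping an internal quality, redness to greenness, must be something the system can detect and report.

```python
# Toy sketch only -- not a model of real qualia or brains.

def bubble_sort(xs):
    # One realization of the sorting function.
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

def quick_sort(xs):
    # A different realization with identical input/output behaviour.
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    return (quick_sort([x for x in rest if x < pivot])
            + [pivot]
            + quick_sort([x for x in rest if x >= pivot]))

# Multiple realizability: same function, different mechanisms.
assert bubble_sort([3, 1, 2]) == quick_sort([3, 1, 2]) == [1, 2, 3]

class ColorSystem:
    """A 'system' that represents 'red' with an internal quality token,
    remembers which token it uses, and reports when that token changes."""

    def __init__(self, quality_for_red="redness"):
        self.quality_for_red = quality_for_red       # current quality
        self.remembered_quality = quality_for_red    # memory of it

    def swap_quality(self, new_quality):
        # An outside intervention changes redness to, say, greenness.
        self.quality_for_red = new_quality

    def report(self):
        # The system compares its current quality to its memory of it.
        if self.quality_for_red != self.remembered_quality:
            return "my red quality has changed"
        return "red looks the same as I remember"

s = ColorSystem()
assert s.report() == "red looks the same as I remember"
s.swap_quality("greenness")
assert s.report() == "my red quality has changed"
```

In this sketch the two sorts answer the "how do you do your sorting" question differently while behaving identically from outside, and `ColorSystem` is one minimal way a system could satisfy the report-the-change requirement.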



On Tue, May 10, 2022 at 7:48 AM Stathis Papaioannou via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Tue, 10 May 2022 at 23:06, Brent Allsop via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>> Hi Stathis,
>> [image: 3_robots_tiny.png]
>>
>> We can say *functionality* is multiply realizable, the above systems
>> being different realizations of the same strawberry-knowledge
>> functionality.
>> We can say the same for *intelligence*.
>> But if you define "*consciousness*" to be computationally bound
>> elemental intrinsic qualities, like redness and greenness, that is
>> basically saying it is important to ask questions like: what is your
>> consciousness like?  Which of the above 3 qualities are you using to paint
>> your conscious knowledge of the strawberry with?
>>
>> And given that definition of "*consciousness*" isn't this the
>> opposite of:
>> "No, I don’t think *consciousness* can be tied to any particular
>> [colorness quality of the] substrate or structure."
>>
>
> If the three subjects differ in their behaviour, such as their description
> of the strawberry or the nature of the redness experience, then they have
> different qualia. If they say exactly the same things under all possible
> circumstances about strawberries, redness, greenness and everything else
> they have the same qualia.
>
>
>> On Mon, May 9, 2022 at 8:27 PM Stathis Papaioannou via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>>
>>> On Tue, 10 May 2022 at 12:12, Brent Allsop via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> Right, so you are agreeing that what consciousness is like is substrate
>>>> dependent, at the elemental level.
>>>> Elemental greenness is not like elemental redness, and the word 'red'
>>>> is not like either one, even though all 3 can represent 'red'
>>>> information sufficiently for the system to tell you the strawberry is red.
>>>>
>>>
>>> No, I don’t think consciousness can be tied to any particular substrate
>>> or structure. I agree that greenness is different to redness and I agree
>>> that the word “red” is not like either one. I also think that glutamate and
>>> electronic circuits are unlike any qualia, they are different categories of
>>> things. If the system can tell that something is red that does not mean
>>> that it has redness qualia. A blind person can use an instrument to tell
>>> you that a strawberry is red. However, a blind person with an instrument is
>>> not functionally identical to someone with normal vision, since the blind
>>> man will readily tell you that he can’t see the strawberry, an obvious
>>> functional difference.
>>>
>>>>
>>>> On Mon, May 9, 2022 at 7:59 PM Stathis Papaioannou via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>
>>>>>
>>>>>
>>>>> On Tue, 10 May 2022 at 11:00, Brent Allsop via extropy-chat <
>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>
>>>>>>
>>>>>>
>>>>>> On Mon, May 9, 2022 at 6:49 PM Stathis Papaioannou via extropy-chat <
>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Tue, 10 May 2022 at 09:04, Brent Allsop via extropy-chat <
>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>
>>>>>>>>
>>>>>>>> Redness isn't about the black box functionality, redness is about
>>>>>>>> how the black box achieves the functionality.
>>>>>>>>
>>>>>>>
>>>>>>> That may seem plausible, but the functionalist position is that
>>>>>>> however the functionality is achieved, redness will be preserved.
>>>>>>>
>>>>>>
>>>>>> In other words, you are not addressing the point I made below: according
>>>>>> to this argument, you can't achieve redness via functionality, either.
>>>>>> So why do you accept your substitution argument against substrate
>>>>>> dependence, but not the same substitution argument for why redness can't
>>>>>> supervene on function, as you claim, either?
>>>>>>
>>>>>
>>>>> The function that must be preserved in order to preserve the qualia is
>>>>> ultimately the behaviour presented to the environment. Obviously if you
>>>>> swap living tissue for electronic circuits the function of the new
>>>>> components is different.
>>>>>
>>>>>
>>>>>> On Mon, May 9, 2022 at 4:53 PM Brent Allsop <brent.allsop at gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>>
>>>>>>>>> Exactly, I couldn't have said it better myself.
>>>>>>>>> Those aren't reasons why redness isn't substrate dependent.
>>>>>>>>> If you ask the system: "How do you do your sorting?", one system
>>>>>>>>> must be able to say "bubble sort" and the other must be able to
>>>>>>>>> say "quick sort".
>>>>>>>>> Just the same as if you asked: "What is redness like for you?",
>>>>>>>>> one would say your redness, the other would say your greenness.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, May 9, 2022 at 4:51 PM Adrian Tymes via extropy-chat <
>>>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>
>>>>>>>>>> Those are not reasons why redness can't supervene.
>>>>>>>>>>
>>>>>>>>>> On Mon, May 9, 2022 at 3:42 PM Brent Allsop via extropy-chat <
>>>>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> OK, let me explain in more detail.
>>>>>>>>>>> Redness can't supervene on a function, because you can
>>>>>>>>>>> substitute the function (say bubble sort) with some other
>>>>>>>>>>> function (quick sort).
>>>>>>>>>>> So redness can't supervene on a function, either, because if
>>>>>>>>>>> "you replace a part (or function) of the brain with a black box
>>>>>>>>>>> that affects the rest of the brain in the same way as the
>>>>>>>>>>> original, the subject must behave the same".
>>>>>>>>>>> So redness can't supervene on any function.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Mon, May 9, 2022 at 4:03 PM Stathis Papaioannou via
>>>>>>>>>>> extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Tue, 10 May 2022 at 07:55, Brent Allsop via extropy-chat <
>>>>>>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Hi Stathis,
>>>>>>>>>>>>>
>>>>>>>>>>>>> OK, let me try saying it this way.
>>>>>>>>>>>>> You use the neural substitution argument to "prove"
>>>>>>>>>>>>> redness cannot be substrate dependent.
>>>>>>>>>>>>> Then you conclude that redness "supervenes" on some function.
>>>>>>>>>>>>> The problem is, you can prove that redness can't "supervene"
>>>>>>>>>>>>> on any function, via the same neural substitution proof.
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> It supervenes on any substrate that preserves the redness
>>>>>>>>>>>> behaviour. In other words, if you replace a part of the brain with a black
>>>>>>>>>>>> box that affects the rest of the brain in the same way as the original, the
>>>>>>>>>>>> subject must behave the same and must have the same qualia. It doesn’t
>>>>>>>>>>>> matter what’s in the black box.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, May 5, 2022 at 6:02 PM Stathis Papaioannou via
>>>>>>>>>>>>> extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Fri, 6 May 2022 at 07:47, Brent Allsop via extropy-chat <
>>>>>>>>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hi Stathis,
>>>>>>>>>>>>>>> On Thu, May 5, 2022 at 1:00 PM Stathis Papaioannou via
>>>>>>>>>>>>>>> extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Fri, 6 May 2022 at 02:36, Brent Allsop via extropy-chat <
>>>>>>>>>>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, May 4, 2022 at 6:49 PM Stathis Papaioannou via
>>>>>>>>>>>>>>>>> extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> I think colourness qualities are what the human behaviour
>>>>>>>>>>>>>>>>>> associated with distinguishing between colours, describing them, reacting
>>>>>>>>>>>>>>>>>> to them emotionally etc. looks like when seen from inside the system. If
>>>>>>>>>>>>>>>>>> you make a physical change to the system and perfectly reproduce this
>>>>>>>>>>>>>>>>>> behaviour, you will also necessarily perfectly reproduce the colourness
>>>>>>>>>>>>>>>>>> qualities.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> An abstract description of the behavior of redness can
>>>>>>>>>>>>>>>>> perfectly capture 100% of the behavior, one to one, isomorphically
>>>>>>>>>>>>>>>>> perfectly modeled.  Are you saying that since you abstractly reproduce 100%
>>>>>>>>>>>>>>>>> of the behavior, that you have duplicated the quale?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Here is where I end up misquoting you because I don’t
>>>>>>>>>>>>>>>> understand what exactly you mean by “abstract description”.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> [image: 3_robots_tiny.png]
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> This is the best possible illustration of what I mean by
>>>>>>>>>>>>>>> abstract vs intrinsic physical qualities.
>>>>>>>>>>>>>>> The first two represent knowledge with two different
>>>>>>>>>>>>>>> intrinsic physical qualities, redness and greenness.
>>>>>>>>>>>>>>> "Red" is just an abstract word, composed of strings of ones
>>>>>>>>>>>>>>> and zeros.  You can't know what it means, by design, without a dictionary.
>>>>>>>>>>>>>>> The redness intrinsic quality your brain uses to represent
>>>>>>>>>>>>>>> knowledge of 'red' things, is your definition for the abstract word 'red'.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> But the specific example I have used is that if you
>>>>>>>>>>>>>>>> perfectly reproduce the physical effect of glutamate on the rest of the
>>>>>>>>>>>>>>>> brain using a different substrate, and glutamate is involved in redness
>>>>>>>>>>>>>>>> qualia, then you necessarily also reproduce the redness qualia. This is
>>>>>>>>>>>>>>>> because if it were not so, it would be possible to grossly change the
>>>>>>>>>>>>>>>> qualia without the subject noticing any change, which is absurd.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I think the confusion comes in the different ways we think
>>>>>>>>>>>>>>> about this:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> "the physical effect of glutamate on the rest of the brain
>>>>>>>>>>>>>>> using a different substrate"
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Everything in your model seems to be based on this kind of
>>>>>>>>>>>>>>> "cause and effect" or "interpretations of interpretations".  I think of
>>>>>>>>>>>>>>> things in a different way.
>>>>>>>>>>>>>>> I would imagine you would say that the causal properties of
>>>>>>>>>>>>>>> glutamate or redness would result in someone saying: "That is red."
>>>>>>>>>>>>>>> However, to me, the redness quality alone isn't the cause of
>>>>>>>>>>>>>>> someone saying "That is red", as someone could lie and say "That is
>>>>>>>>>>>>>>> green", proving that redness isn't the only cause of what the person
>>>>>>>>>>>>>>> is saying.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> The causal properties of the glutamate are basically the
>>>>>>>>>>>>>> properties that cause motion in other parts of the system. Consider a
>>>>>>>>>>>>>> glutamate molecule as a part of a clockwork mechanism. If you remove the
>>>>>>>>>>>>>> glutamate molecule, you will disrupt the movement of the entire clockwork
>>>>>>>>>>>>>> mechanism. But if you replace it with a different molecule that has similar
>>>>>>>>>>>>>> physical properties, the rest of the clockwork mechanism will continue
>>>>>>>>>>>>>> functioning the same. Not all of the physical properties are relevant, and
>>>>>>>>>>>>>> they only have to be replicated to within a certain tolerance.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> The computational system, and the way the knowledge is
>>>>>>>>>>>>>>> consciously represented, is different from simple cause and effect.
>>>>>>>>>>>>>>> The entire system is aware of all of the intrinsic qualities
>>>>>>>>>>>>>>> of each of the pixels on the surface of the strawberry (along with any
>>>>>>>>>>>>>>> reasoning for why it would lie or not).
>>>>>>>>>>>>>>> And it is this composite awareness that is the cause of the
>>>>>>>>>>>>>>> system choosing to say "that is red", or choosing to lie in
>>>>>>>>>>>>>>> some way.
>>>>>>>>>>>>>>> It is the entire composite 'free will system' that is the
>>>>>>>>>>>>>>> initial cause of someone choosing to say something, not any single
>>>>>>>>>>>>>>> quality like the redness of a single pixel.
>>>>>>>>>>>>>>> For you, everything is just a chain of causes and effects, with
>>>>>>>>>>>>>>> no composite awareness and no composite free will system involved.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I am proposing that the awareness of the system be completely
>>>>>>>>>>>>>> ignored, and only the relevant physical properties be replicated. If this
>>>>>>>>>>>>>> is done, then whether you like it or not, the awareness of the system will
>>>>>>>>>>>>>> also be replicated. It’s impossible to do one without the other.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> If I recall correctly, you admit that the quality of your
>>>>>>>>>>>>>>> conscious knowledge is dependent on the particular quality of your redness,
>>>>>>>>>>>>>>> so qualia can be thought of as a substrate on which the quality of your
>>>>>>>>>>>>>>> consciousness depends, right?  If you are only focusing on a different
>>>>>>>>>>>>>>> substrate being able to produce the same 'redness behavior', then all you
>>>>>>>>>>>>>>> are doing is making two contradictory assumptions.  If you take that
>>>>>>>>>>>>>>> assumption, then you can prove that nothing, not even a redness function,
>>>>>>>>>>>>>>> can have redness, for the same reason.  There must be something that is
>>>>>>>>>>>>>>> redness, and the system must be able to know when redness changes to
>>>>>>>>>>>>>>> anything else.  All you are saying is that nothing can do that.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I am saying that redness is not a substrate, but it
>>>>>>>>>>>>>> supervenes on a certain type of behaviour, regardless of the substrate of
>>>>>>>>>>>>>> its implementation. This allows the system to know when the redness changes
>>>>>>>>>>>>>> to something else, since the behaviour on which the redness supervenes
>>>>>>>>>>>>>> would change to a different behaviour on which different colour qualia
>>>>>>>>>>>>>> supervene.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> That is why I constantly ask you what could be responsible
>>>>>>>>>>>>>>> for redness.  Because whatever you say that is, I could use your same
>>>>>>>>>>>>>>> argument and say it can't be that, either.
>>>>>>>>>>>>>>> If you could describe to me what redness could be, this
>>>>>>>>>>>>>>> would falsify my camp, and I would jump to the functionalist camp.  But
>>>>>>>>>>>>>>> that is impossible, because all your so-called proof is claiming, is that
>>>>>>>>>>>>>>> nothing can be redness.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> If it isn't glutamate that has the redness quality, what
>>>>>>>>>>>>>>> can?  Nothing you say will work, because of your so-called proof.  Because
>>>>>>>>>>>>>>> when you have contradictory assumptions you can prove all claims to be both
>>>>>>>>>>>>>>> true and false, which has no utility.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Glutamate doesn’t have the redness quality, but glutamate or
>>>>>>>>>>>>>> something that functions like glutamate is necessary to produce the redness
>>>>>>>>>>>>>> quality. We know this because it is what we observe: certain brain
>>>>>>>>>>>>>> structures are needed in order to have certain experiences. We know that it
>>>>>>>>>>>>>> can’t be substrate specific because then we could grossly change the qualia
>>>>>>>>>>>>>> without the subject noticing, which is absurd, meaning there is no
>>>>>>>>>>>>>> difference between having and not having qualia.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Until you can provide some falsifiable example of what could
>>>>>>>>>>>>>>> be responsible for your redness quality, further conversation seems to be a
>>>>>>>>>>>>>>> waste.  Because my assertion is that given your assumptions NOTHING will
>>>>>>>>>>>>>>> work, and until you falsify that, with at least one example possibility,
>>>>>>>>>>>>>>> why go on with this contradictory assumption where qualitative
>>>>>>>>>>>>>>> consciousness, based on substrates like redness and greenness, simply isn't
>>>>>>>>>>>>>>> possible?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> _______________________________________________
>>>>>>>>>>>>>>> extropy-chat mailing list
>>>>>>>>>>>>>>> extropy-chat at lists.extropy.org
>>>>>>>>>>>>>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>> --
>>>>>>>>>>>>>> Stathis Papaioannou
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>> --
>>>>>>>>>>>> Stathis Papaioannou
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>
>>>>>>> --
>>>>>>> Stathis Papaioannou
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Stathis Papaioannou
>>>>>
>>>>
>>> --
>>> Stathis Papaioannou
>>>
>>
>
>
> --
> Stathis Papaioannou
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.extropy.org/pipermail/extropy-chat/attachments/20220510/bcdf91ee/attachment-0001.htm>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 3_robots_tiny.png
Type: image/png
Size: 26214 bytes
Desc: not available
URL: <http://lists.extropy.org/pipermail/extropy-chat/attachments/20220510/bcdf91ee/attachment-0001.png>

