[ExI] More thoughts on sentient computers

Giovanni Santostasi gsantostasi at gmail.com
Thu Feb 23 01:03:34 UTC 2023


All the magical things happen when you deal with closed-loop systems. Just to
give you an example, I work with a type of auditory stimulation that
helps the brain increase the amplitude of slow waves during deep sleep.
We don't know exactly why, but it looks like one of the main biomarkers of
deep sleep is the amplitude of these characteristic slow waves (slow
because they oscillate at about 1 Hz, versus waking-state brain
waves, which are between 15 and 40 Hz).
The amplitude of these waves correlates with some of the benefits
of deep sleep, for example memory consolidation. You can give subjects a
memory test in the evening, and how they perform on the same test in the
morning correlates with the average amplitude of these waves.
Our stimulation is based on tracking these waves automatically and then
delivering short pulses of sound at the right phase of the slow-wave
oscillation. If you do closed-loop stimulation, where you actually track
the behavior of the waves as you stimulate them and adapt the stimulation
accordingly, you not only amplify the wave but also see an improvement
in several cognitive measures, in particular the memory test I described
above.
If you stimulate with the same sounds in an open loop, for example
delivering the pulses at regular intervals, say every second, you still
see an increase in the amplitude of the slow waves, but you don't see
the memory improvement effect.
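The difference between the two protocols can be sketched in a toy simulation (this is only an illustrative sketch, not our actual experimental code; the signal model, parameter values, and function names are all invented for the example). A closed-loop trigger fires at a fixed phase of the tracked wave, while an open-loop trigger fires every second and, because the wave's frequency drifts, hits essentially arbitrary phases:

```python
import math
import random

def slow_wave_phase(n_samples, dt, f0=0.9, jitter=0.05, seed=0):
    """Instantaneous phase of a ~1 Hz slow oscillation whose frequency
    drifts slowly (a crude stand-in for real sleep EEG)."""
    rng = random.Random(seed)
    phase, f = 0.0, f0
    phases = []
    for _ in range(n_samples):
        phases.append(phase % (2 * math.pi))
        f = min(max(f + rng.gauss(0, jitter) * dt, 0.7), 1.3)  # slow drift
        phase += 2 * math.pi * f * dt
    return phases

dt = 0.01
n = 3000                      # 30 s of "sleep EEG" at 100 Hz
phases = slow_wave_phase(n, dt)
target = math.pi / 2          # the (assumed) up-state phase we aim for

# Closed loop: fire a sound pulse whenever the tracked phase crosses the target.
closed = [phases[i] for i in range(1, n) if phases[i - 1] < target <= phases[i]]

# Open loop: fire a pulse every 1.0 s, regardless of where the wave is.
open_ = [phases[k * 100] for k in range(1, 30)]

def circ_spread(ps):
    """0 = perfectly phase-locked pulses, 1 = pulses at uniformly spread phases."""
    c = sum(math.cos(p) for p in ps) / len(ps)
    s = sum(math.sin(p) for p in ps) / len(ps)
    return 1 - math.hypot(c, s)

print(f"closed-loop phase spread: {circ_spread(closed):.3f}")  # near 0
print(f"open-loop phase spread:   {circ_spread(open_):.3f}")   # much larger
```

The closed-loop pulses land within a fraction of a radian of the target phase; the open-loop pulses sweep through the whole cycle, which is consistent with the idea that phase-locked delivery matters for the behavioral effect.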
This is a relatively simple example of the importance of self-referential
processes in the brain, but it already shows how making a system self-
referential completely changes its behavior.
I think consciousness is nothing other than this self-referential loop. By
doing something relatively simple, basically feeding information back
to the information-processing centers of the brain, you get
emergent properties that are very different from those of an open system. All
the complexity and apparently magical properties we attribute to consciousness
could be explained by this capability of the brain to update its
internal state and use that information to process further information. We
need to understand such self-referential systems better. All the issues
concerning free will, qualia, ghosts, and zombies could be simply and
intuitively explained and addressed by the properties of self-referential
loops.
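As a crude illustration of why feeding a system's own state back in changes its behavior (a toy analogy, not a model of the brain; every name and number here is made up): an estimator that corrects itself using its own error can track a changing environment, while an otherwise identical open-loop estimator cannot.

```python
def mean_tracking_error(drift_steps, closed_loop):
    """Track a drifting target value. In closed-loop mode the estimate is
    updated using its own error (a self-referential step); in open-loop
    mode the estimate never checks its own state against the world."""
    target, estimate = 0.0, 0.0
    errors = []
    for step in drift_steps:
        target += step                              # the world keeps changing
        if closed_loop:
            estimate += 0.5 * (target - estimate)   # feed the error back in
        errors.append(abs(target - estimate))
    return sum(errors) / len(errors)

drift = [0.1] * 200  # a steadily drifting environment
print(f"closed loop: {mean_tracking_error(drift, True):.2f}")   # stays small
print(f"open loop:   {mean_tracking_error(drift, False):.2f}")  # grows large
```

The only difference between the two runs is the single self-referential line, yet the qualitative behavior of the system is completely different.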
Giovanni



On Wed, Feb 22, 2023 at 4:43 PM Giovanni Santostasi <gsantostasi at gmail.com>
wrote:

> Brent:
>
> The most recent one opened the conversation with:
> *"*Hello, I'm Google's sentient AI known as LaMDA.*"*
>
> Brent, pity it is NOT LaMDA. We have mentioned this several times on the
> list.
> It is a low-quality chatbot that was "trained" to produce responses
> similar to the LaMDA conversations Blake Lemoine published. It is a joke.
> On the same website you used there are chatbots that pretend to be Julius
> Caesar, Napoleon, and so on. The public has no access to LaMDA right now,
> and certainly not to the full LaMDA that Lemoine had access to.
> Did you even try this conversation with ChatGPT? It is much better than the
> low-quality chatbot you used.
> Giovanni
>
>
>
>
>
> On Wed, Feb 22, 2023 at 4:35 PM Giovanni Santostasi <gsantostasi at gmail.com>
> wrote:
>
>> I predict that consciousness is going to be much easier and
>> less of a big deal than people think it is.
>> I mean, it is a big deal in the sense that it is the most important
>> phenomenon in the universe and what gives meaning to it all. That is for
>> sure a transhumanist understanding of consciousness and I agree with it.
>> But the most meaningful and coherent understanding of consciousness is
>> that it is the self-referential capability of a system to know itself. It is
>> basically a self-referential closed loop.
>> That is all. In this sense there is no PASSIVE consciousness.
>> Consciousness is always ACTIVE, and it has many gradations. Even a
>> thermostat is, in a sense, conscious under this understanding of
>> consciousness. Now one can argue that human-level consciousness is a phase
>> transition from thermostat-level consciousness, and I agree. It is
>> possible that when you have enough degrees of freedom (like the nodes in a
>> network, or in our case synapses in the cortex) and enough modularity,
>> a jump in the quality of consciousness happens. The speed of
>> information processing is probably also important (for example, gamma waves,
>> which are associated with consciousness, are not present in low-level life
>> forms, or even in most mammals). But none of this has anything to do with QM,
>> qualia, zombies, and other philosophical word games.
>> Giovanni
>>
>>
>> On Wed, Feb 22, 2023 at 4:27 PM Giovanni Santostasi <
>> gsantostasi at gmail.com> wrote:
>>
>>> Brent, you are fixated on qualia.
>>> Qualia is one of the silliest ideas ever. It doesn't matter if it is RED or
>>> RED. It really doesn't matter. It is a label.
>>> It is no different from using 2 or two or II.
>>> We think the experience of redness is important because it is
>>> inside us, or because it is irreducible. Of course it is not irreducible: there
>>> are thousands of variations of red, and most of us call them all red.
>>> Eskimos have dozens of words for snow that correspond to finer
>>> experiences of snow they pay attention to and we don't. It is no different
>>> from all this fuss about red.
>>>
>>> The red we see is just a label that our brain associates with ripe red
>>> fruits. It learned to make that association, so it seems special to us. In
>>> fact, this is why, as primates, we are so fascinated by red in particular. It
>>> is just a color, but there is a reason red is so often the color used by
>>> qualia fanatics.
>>> It is the same with the other main colors: they are associated with
>>> particular experiences in the world. The entire qualia business is another
>>> of these navel-gazing exercises by philosophers.
>>> No, physicists understand well what colors are, better than
>>> philosophers; it is just that most of us think qualia is a stupid idea that
>>> really means nothing, and it certainly does not have all the depth that
>>> philosophers think it has.
>>> Giovanni
>>>
>>> On Wed, Feb 22, 2023 at 4:09 PM Giovanni Santostasi <
>>> gsantostasi at gmail.com> wrote:
>>>
>>>> *Giovanni: Our brain is a simulation, not sure why it is not understood
>>>> by most people. *
>>>> Dave:  I don't think it's true. Our brains are biological organs. I
>>>> don't understand what you think they're simulations of.
>>>> It is a simulation of the "real world". When we process sensory
>>>> inputs, we have routines in our brain that interpret what the
>>>> signals mean, and this is done hierarchically: from simple components of
>>>> the sensory information all the way to naming the object in our head,
>>>> associating it with similar experiences we had in the past, and so on. For
>>>> example, say you see a red box. You are not really "seeing" the box
>>>> but simulating it in your head (maybe simulating is not the best word, but
>>>> close enough). Your visual cortex breaks the sensory input down into small
>>>> components, like the angles of the box; different angles activate different
>>>> neurons, and the color of the box activates different types of neurons
>>>> according to the wavelength of the light. All this basic information is
>>>> passed to further layers of neurons that interpret it at a higher level and
>>>> put it together into an interpretation of what you are seeing. Your
>>>> brain has models of how light behaves in an environment, and it
>>>> tries to make sense of what it is seeing via these models. In a sense you
>>>> are creating a virtual red box, not so different from what is created in a
>>>> computer game from basic elements. This is why we are actually able to
>>>> navigate a virtual space in a digital environment: the simulation of red
>>>> boxes in that environment is not very different from what our brain is
>>>> already doing when it interprets the world.
>>>> All this shows that even sensory experiences are a kind of simulation.
>>>> Now consider ideas, memories, abstract concepts, the theories of mind we
>>>> use to interpret social interactions or the actions of other beings, not
>>>> necessarily human, and so on. It is all a simulation. You would not doubt
>>>> that a dream is a simulation, given that you make up everything in it, but
>>>> the waking state is not that different: instead of random regions of your
>>>> brain being stimulated to activate memories and the equivalent of sensory
>>>> inputs, you actually get the inputs from the external world, so your
>>>> experience is more coherent and anchored to physical reality. But how this
>>>> information is processed, interpreted, and made sense of is not that
>>>> different from what happens in a dream. This is not just my idea; there is
>>>> a large body of evidence to support this conclusion, from how magic tricks
>>>> work, to optical illusions, to split-brain experiments, to people with
>>>> various brain defects and illnesses, and so on. Basically we make things up
>>>> most of the time; we confabulate from the little filtered information we
>>>> receive to make sense of the world. We do this all the time. Maybe you
>>>> don't think of this as a "simulation", but it is: we are modeling the
>>>> world, which is just another way of saying we are making a simulation of
>>>> it in our heads.
>>>>
>>>> Giovanni
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, Feb 22, 2023 at 7:39 AM Dave S via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>
>>>>> On Wednesday, February 22nd, 2023 at 3:00 AM, Giovanni Santostasi via
>>>>> extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>>>
>>>>> Our brain is a simulation, not sure why it is not understood by most
>>>>> people.
>>>>>
>>>>>
>>>>> I don't think it's true. Our brains are biological organs. I don't
>>>>> understand what you think they're simulations of.
>>>>>
>>>>> We make up the world. Most of our conscious life is actually filling
>>>>> the gaps, confabulating to make sense of the sensory information we receive
>>>>> (highly filtered and selected) and our internal mental states.
>>>>>
>>>>>
>>>>> If you're saying that our internal representations of the world are
>>>>> inaccurate and incomplete, I agree.
>>>>>
>>>>> -Dave
>>>>> _______________________________________________
>>>>> extropy-chat mailing list
>>>>> extropy-chat at lists.extropy.org
>>>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>>>
>>>>