[ExI] LLMs cannot be conscious

Adrian Tymes atymes at gmail.com
Sat Mar 18 19:24:12 UTC 2023


On Sat, Mar 18, 2023 at 11:42 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Sat, Mar 18, 2023, 1:54 PM Adrian Tymes via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Volition and initiative are essential parts of consciousness.
>>
>
> Are apathetic people not conscious?
>

Apathetic people, if left alone, will still move themselves to feed,
defecate, and so on.  Granted, this is their minds acting in response to
stimuli from their bodies, but again, such stimuli are outside the realm of
a pure LLM.


>> Mere reacting, as LLMs do, is not consciousness.
>>
>
> All our brains (and our neurons) do is react to stimuli, either generated
> from the environment or within other parts of the brain.
>

Granted.  Part of the difference is the range and near-constant nature of
said stimuli, both of which are missing in things that are only LLMs.
(Again, making that distinction since human minds arguably include LLMs,
the critical difference being that they are more than just LLMs.)


>> Volition and initiative require motivation or goals.  If an entity is not
>> acting in direct response to an external prompt, what is it doing?
>>
>
> Sleeping.
>

A fair point, though a side matter relative to what I was talking about.
When an entity that is conscious - which includes not being asleep at that
time - is not acting in direct response to an external prompt, what is it
doing?


>> If someone were to leave an LLM constantly running
>>
>
> The idea of "constantly," I think, is an illusion. A neuron might wait 1
> millisecond or more between firings. That's 10^40 Planck times of no
> activity. That's an ocean of time of no activity.
>
>> *and* hook it up to sensory input from a robot body, that might overcome
>> this objection.
>>
>
> Sensory input from an eye or ear is little different from sensory input of
> text.
>

Not so, at least in this context.  Setting aside the realms of difference
between full sight (let alone sound and the rest) and mere text, these
sensory inputs of text only happen when some other entity provides them.
In contrast, a full sensory suite would obtain sensory data from the
environment without waiting on another entity to provide each specific
packet of information.
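
(A quick sanity check on the 10^40 figure quoted earlier, taking the Planck
time to be roughly 5.4 x 10^-44 s: one millisecond is about
10^-3 s / 5.4 x 10^-44 s, or roughly 1.9 x 10^40 Planck times, so the order
of magnitude holds.)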


> Both are ultimately digital signals coming in from the outside to be
> interpreted by a neural network. The source and format of the signal are
> unimportant for its capacity to realize states of consciousness, as
> demonstrated by experiments of Paul Bach-y-Rita on sensory substitution.
>

Substituting usually-present sound/scent/etc. for usually-present sight, or
other such combinations, substitutes some usually-present sensory data for
other usually-present sensory data.  In both cases, the entity is not
usually waiting on someone else to give it its next hit of sensory data.
The source does matter, albeit indirectly.


>> But that would no longer be only an LLM, and the claim here is that LLMs
>> (as in, things that are only LLMs) are not conscious.  In other words: an
>> LLM might be part of a conscious entity (one could argue that human minds
>> include a kind of LLM, and that babies learning to speak involves initial
>> training of their LLM) but it by itself is not one.
>>
>
> I think a strong argument can be made that individual parts of our brains
> are independently conscious. For example, the Wada test shows each
> hemisphere is independently conscious. It would not surprise me if the
> language processing part of our brains is also conscious in its own right.
>

A fair argument.  My position is that not all such parts are independently
conscious (in particular, not the language processing part), but that
consciousness is a product of several parts working together.  (I am not
specifying which parts here, just that language processing by itself is
insufficient, since the question at hand is whether a language processing
model by itself is conscious.)