[ExI] LLM's cannot be concious

Jason Resch jasonresch at gmail.com
Thu Mar 23 18:09:40 UTC 2023


On Thu, Mar 23, 2023, 12:09 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Gordon's objection is at a more basic level, if I understand it correctly.
>
> On Wed, Mar 22, 2023 at 7:11 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> On Wed, Mar 22, 2023, 8:23 PM Gordon Swobe <gordon.swobe at gmail.com>
>> wrote:
>>
>>> On Tue, Mar 21, 2023 at 6:43 AM Jason Resch <jasonresch at gmail.com>
>>> wrote:
>>>
>>>> I address this elsewhere in the thread. A sufficient intelligence given
>>>> only a dictionary, could eventually decode it's meaning. I provided an
>>>> example of how it could be done.
>>>>
>>>
>>> I saw that, and I disagree. I think if you try to work out an example in
>>> your head, you will see that it leads to an infinite regression, an endless
>>> search for meaning. Like ChatGPT, you will learn which word symbols define
>>> each other word symbol, and you learn the rules of language (the syntax),
>>> but from the dictionary alone you will never learn the actual meaning of
>>> the words (the referents).
>>>
>>> Try it with any word you please. You rapidly have a massive list of
>>> words for which you have no meaning and for which you much keep looking up
>>> definitions finding more words for which you have no meaning, and in your
>>> list you also have many common words (like "the" and "a") that lead to
>>> endless loops in your search for meaning.
>>>
>>
>> I see the word "Pi" defined by a string of 20 symbols which if I
>> interpret them to be digits in base 10 I confirm to be the ratio of a
>> circle's circumference to its diameter. This not only tells me about the
>> number system used in the dictionary but also what each digit means.
>>
>
> What are "digit", "base 10", "ratio", "circle", "circumference",
> "diameter", and "number system"?
>
>
>> I count 92 entries with the string "chemical element" in their
>> definition. X number of which have the string "radioactive" and the other
>> (92-X) have the word "stable". I confirm these must be the 92 naturally
>> occurring elements and the atomic numbers listed in the definition tell me
>> the names of each of the elements.
>>
>
> What are "radioactive", "stable", "naturally occurring elements", and
> "atomic numbers"?
>
>
>> I find an entry that includes "H2O aka dihydrogen monoxide" under the
>> entry "water". I know that this is the word used to refer to the compound
>> composed of one atom of oxygen bound to two elements of hydrogen.
>>
>
> You know this.  An AI would not necessarily start with this knowledge.
>
> And so on.  The basic objection is: if you start from literally no
> knowledge of the language other than "this word often goes with that in
> this way", how do you generate that first little bit of knowledge from
> which you can extrapolate the rest?
>
> Let us take for example Japanese, Russian, or some other language you
> don't know that is not based on the grammar and syntax of a language that
> you do know.  You have access to lots of material written in that language,
> but no access to translators or any other oracle that can tell you what any
> of those words mean in languages you know.
>
> If this sounds familiar from science fiction, it is part of the classic
> "first contact" scenario.  Let it be said, this is a solved problem for
> humans - but the ways in which they communicated those first meanings, that
> linked basic concepts to words, are not necessarily available for AIs, nor
> can ChatGPT et al necessarily be programmed with knowledge of a few words.
> (As most people who have ever written code know, dismissing things as "a
> simple matter of programming" means "I have no idea how to do this".)  So
> how do AIs get over this hump?
>


Take all the neural impulses from the sense organs a human brain receives
from birth to age 25 as a huge list of tuples in the format: (neuron id,
time-stamp). This is ultimately just a list of numbers. But present in
these numbers exists the capacity for a brain to learn and know everything
a 25-year-old comes to learn and know about the world. If a human brain can
do this from this kind of raw, untagged, "referentless" data alone, then
why can't a machine? I've raised this point multiple times in my replies,
but have yet to have anyone take me up on explaining why it's impossible
for an AI when it is clearly possible for the human brain.

Jason
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.extropy.org/pipermail/extropy-chat/attachments/20230323/d7816ed0/attachment.htm>


More information about the extropy-chat mailing list