[ExI] Symbol Grounding
brent.allsop at gmail.com
Wed Apr 26 17:31:52 UTC 2023
OK, let me see if I can summarize what is important in what you are saying.
We all agree that subjective qualities exist (thank you, Giovani, for
explicitly expressing this in the statement I quoted); we are just making
falsifiable predictions about the nature of those qualities.
But help me out with something regarding this functional nature of
qualities. You also said: "consciousness is the verbs not the nouns". I
would say the verb is "pick", as in pick the strawberry. The strawberry is
the object, or the noun. I would say the quality is the property of the
noun, which tells us what to pick (the red one) and what not to pick (the
green one). And whether we use a subjective redness property or a
subjective greenness property to represent the red one, either way we can
pick the right one. But what does any of that function have to do
with determining what redness is like? It seems to me that the properties we
represent our knowledge with are substrate dependent. If you change it
from glutamate to glycine, it is going to be physically different, and even
though both will allow you to pick the correct strawberry (if you have the
correct dictionary), they are still representing the knowledge with
different physical properties (or different subjective qualities, if you
prefer).
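The "correct dictionary" point can be sketched as a toy program. Everything here is invented for illustration (the codings, the dictionary entries, the function names are not from the thread): two agents represent the same stimulus with different internal properties, yet because each carries its own dictionary from internal property to action, they behave identically.

```python
# Toy sketch: substrate-dependent representation, behavior-preserving dictionary.
# All names and mappings are hypothetical, chosen only to illustrate the argument.

# Physical stimulus -> internal representational property for each agent.
NORMAL_CODING = {"red": "redness", "green": "greenness"}
INVERTED_CODING = {"red": "greenness", "green": "redness"}   # qualia invert

# Each agent's "dictionary": which internal property means "pick this one".
NORMAL_DICTIONARY = {"redness": "pick", "greenness": "leave"}
INVERTED_DICTIONARY = {"greenness": "pick", "redness": "leave"}

def act(stimulus, coding, dictionary):
    quality = coding[stimulus]      # substrate-dependent representation
    return dictionary[quality]      # behavior, after translating via the dictionary

# Both agents act the same on every stimulus...
for stimulus in ("red", "green"):
    assert act(stimulus, NORMAL_CODING, NORMAL_DICTIONARY) == \
           act(stimulus, INVERTED_CODING, INVERTED_DICTIONARY)

# ...even though their internal representations of "red" differ.
assert NORMAL_CODING["red"] != INVERTED_CODING["red"]
```

The sketch only captures the functional claim (identical behavior under inversion); it takes no side on whether the differing internal properties matter, which is exactly the disagreement in this thread.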
On Wed, Apr 26, 2023 at 8:50 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Wed, Apr 26, 2023, 8:07 AM Brent Allsop via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> Hi Henry,
>> Welcome to the conversation; it is good to have other people weigh in on
>> this, as that is required to better understand how everyone currently thinks
>> about what we are.
>> It took me some time to digest what you are saying. I think I
>> understand it, so I want to say it back to you to see if I do.
>> First, let me see if I can summarize the primary disagreement in this
>> entire conversation. It now appears there are some things both camps can
>> agree on; we just have differing values about what is important. I
>> think Giovani captured this brilliantly with:
>> “This is again a demonstration of the validity of the functionalist
>> understanding of brain function. All I [functionalists] care about is
>> the association, not how it feels to have the redness experience but how
>> generalized it is.”
>> So, Henry, you indicated the Perceiving a Strawberry
>> video was thought-provoking. Perhaps it got you to realize there are
>> qualities or properties of subjective knowledge; you are just indicating
>> that external consistency in our ability to communicate about the nature of
>> reality out there is more important than whatever property or type of code an
>> intelligence may be using to represent that knowledge in its brain.
>> In other words, it seems to me that all that functionalists value is that
>> we can all say "The strawberry is red" (as portrayed in this image), while
>> some of us value the nature of the knowledge inside the brain which
>> enables us all to say "The strawberry is red."
>> [image: The_Strawberry_is_Red_064.jpg]
>> Henry, Giovani, and everyone. Does that capture the differences between
>> the substrate independent, and substrate dependent camps?
>> We all agree on the facts portrayed in this image, we are just valuing
>> different parts of it, and some of us want to ignore other parts of it.
> Functionalism doesn't deny the existence of qualia. As far as I know, only
> eliminative materialism goes that far.
> Functionalism is just one among many theories in philosophy of mind that
> attempts to explain what underlies consciousness (and qualia).
> Functionalism says consciousness is the verbs, not the nouns, that make a
> mind. A human mind is what the human brain does: its set of actions and
> behaviors, not what its constituent elements happen to be. So long as
> the causal organization between the mind's elements is preserved, it makes
> no difference what the elements are or are made of.
> That's all functionalism says.
> Functionalism makes no denials of the reality of consciousness or qualia,
> nor does it make any statements regarding their value.
>> On Tue, Apr 25, 2023 at 9:45 PM Henry Rivera via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>> I really liked that video about the red strawberries. It's
>>> thought-provoking. I'm curious to get Brent's response. Maybe color is the
>>> wrong simple example to use for communicating about qualia. It worked well
>>> enough until we realized color perception is a subjective contextual
>>> process that did not evolve to favor reflecting (consensus) reality.
>>> Perceived color constancy is more important, that is, has been more
>>> adaptive for us. How about them apples... or strawberries.
>>> To quote my late friend and rapper Sean Byrne: "Nothing exists except
>>> for your perception, the pain of the past only serves as a lesson."
>>> On Mon, Apr 24, 2023 at 7:00 PM Brent Allsop via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>> Hi Jason,
>>>> On Mon, Apr 24, 2023 at 3:09 PM Jason Resch via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>> as in say the strawberry is red, but it would answer the question:
>>>>>> "What is redness like for you." differently.
>>>>> I don't see why they would answer this question differently if
>>>>> everything got inverted, including all emotional associations. If you
>>>>> changed only the word, but left the emotional associations as they were,
>>>>> then you could perhaps get different descriptions.
>>>> I'm skipping a bunch of stuff that I think is less important, and
>>>> focusing on what I think is most important, but if I skip over something
>>>> important, don't let me brush over it.
>>>> Giovani, evidently you think that even for a person engineered to have
>>>> red/green qualia inversion, you would consider them indistinguishable,
>>>> and that the quality difference of their subjective knowledge wouldn't matter?
>>>> It sounds like Jason at least thinks the two would be qualitatively
>>>> different, and that this difference is important, if you are asking what
>>>> redness is like for each of them. Jason just has a problem with how we
>>>> would know, or how we would report that. For the moment, can we just say
>>>> we are God for a bit? Then we can know whether the redness is now greenness,
>>>> even though the person wouldn't know, since all of his memories and
>>>> references have been remapped.
>>>> The prediction is that, in the future, we will be able to read people's
>>>> minds and objectively observe whether it is Jason's redness or Jason's
>>>> greenness, via neural ponytails or whatever.
>>>> The critically important part is that we focus only on the important
>>>> thing: the quality of the redness. Not what the person thinks that quality
>>>> is called, or whether he is lying, or whatever. Let's only focus on the
>>>> quality of the redness experiences. Would God say that quality has changed
>>>> or not, regardless of what the person says?
>>>> So, again, if you engineered someone to be a qualia invert, God could
>>>> honestly tell those two people that one's redness was like the other's
>>>> greenness. And even though they would function differently when asked what
>>>> redness is like for them, they would know, since God told them, that their
>>>> redness was like the other's greenness; so despite being otherwise
>>>> identical, they would be qualitatively different.
>>>> So, would you agree that the quality of their consciousness depends on
>>>> what their redness is like, and that if one's redness quality is like the
>>>> other's greenness, that would be important and objectively observable?
>>>> extropy-chat mailing list
>>>> extropy-chat at lists.extropy.org