[ExI] Symbol Grounding

Stuart LaForge avant at sollegro.com
Sun Apr 23 22:42:35 UTC 2023

Quoting Brent Allsop via extropy-chat <extropy-chat at lists.extropy.org>:

> This is so frustrating.  I'm asking a simple, elementary school level
> question.

So you think that the Hard Problem of Consciousness, reframed as your  
so-called "Colorness Problem", is an elementary school level question?  
Then maybe you should quit bugging us about it and seek the advice of  
elementary school children.

> I have friends not practiced in theories of consciousness, who
> get this immediately.  Yet everyone here (except Gordon?)  just either
> changes the subject completely (to the unrelated topic of what is required
> for intelligence), or asks yet more questions, as you are doing here.

Once again, if your friends have all the answers, then why are you  
constantly bringing it up on this list? You keep seeking affirmation  
from people who fundamentally disagree with you, and you think that we  
might relent if you keep telling us the same thing a million times?  
That is not the method of science or philosophy, that is instead the  
hallmark of propaganda.

> After I sent that first version, I realized it might have better revealed
> the issue to have stated it like this:
> How is it that a bunch of abstract words, like 'red', floating around
> inside a chatbot's brain can end up with [image: green_border.png]

The only way that can happen is if at some point during the chatbot's  
training the English word "red" was associated with the picture in  
question.
> Instead of answers, all I get are "It's turtles all the way down."  Or
> infinitely recursive questions as answers, or "Let's completely ignore that
> 'hard'* question and change the subject to an 'easy'* question like what is
> required for something to be intelligent?"

In order for people to indulge your questions, you need to be prepared  
to accept their answers, even if you disagree with them. And when  
people do disagree with you, that does not obligate them to write  
content for your commercial website in the form of a "competing camp".

> The properties of things are observable or demonstratable physical facts
> about those things.  I'm simply asking what are [image:
> red_border.png] and [image:
> green_border.png] properties of?

I would say those files are properties of whatever computer is hosting them.

> If they are properties of physical things, what are those physical things?

Those files are composed of information, which is physical but also  
intangible. That is very likely how "redness" is encoded in your  
head, since "redness", and qualia in general, are also intangible.

> (Even an 8 year old knows how to find out what color things are)
> If they are properties of different functions, what functions?  (How would
> you test for this?)

When you say "they", are you still referring to your image files of  
colored squares? If so, you could write a Python function that  
accepts your colored square as input and outputs what color it is.  
You could test it by executing the code.
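A minimal sketch of such a function, assuming the "colored square" has already been reduced to a single RGB pixel value (reading the pixel out of an actual image file, e.g. with Pillow, is omitted here, and the small palette of named colors is an illustrative assumption):

```python
# Illustrative palette: a handful of named colors and their RGB values.
# These names and values are assumptions, not taken from any image file.
NAMED_COLORS = {
    "red":   (255, 0, 0),
    "green": (0, 128, 0),
    "blue":  (0, 0, 255),
    "white": (255, 255, 255),
    "black": (0, 0, 0),
}

def name_color(rgb):
    """Return the palette name whose RGB value is nearest to `rgb`,
    measured by squared Euclidean distance in RGB space."""
    def dist_sq(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(NAMED_COLORS, key=lambda name: dist_sq(rgb, NAMED_COLORS[name]))

print(name_color((250, 10, 5)))    # a reddish pixel -> "red"
print(name_color((10, 120, 15)))   # a greenish pixel -> "green"
```

The point of the sketch is that "what color is this?" reduces, for a program, to a purely relational computation over numbers, with no appeal to anything like an experienced quale.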

> If they are properties of different relationships, what relationships?
> (How would you test for this?)

Color perception could be thought of as a property of relationships  
between wavelengths of light and brain states. One test for this is to  
ask somebody blind from birth what their favorite color is. If they  
don't have an opinion, then that suggests that "colorness" is  
related to light and brain states.

> If they are properties of some spiritual realm, what/where is that?  (How
> would you test for this?)

We would have to find a spiritual realm amenable to empirical methods  
to test this. I have some ideas on this, but it is premature to  
discuss them now.

> If they are properties of turtles all the way down, what is the difference
> between  [image: red_border.png] turtles all the way down and [image:
> green_border.png] turtles all the way down? (How would you test for this?)

You are taking the "turtles all the way down" metaphor for recursion  
a little too literally here. There are no actual turtles whose  
properties would be meaningful to discuss.

> ......
> * Chalmers' classifications of what he incorrectly thinks is a 'hard
> problem' vs what is an 'easy problem.'
> In reality, his so-called "hard problem" is the most trivially easy
> problem, one of the first things we learned in elementary school.  It is
> simply: "What is  [image: red_border.png]  a property of?"
> Everyone making it too "hard" is the only problem.

Again, reframing Chalmers's hard problem as a color problem does not  
make it any easier. If you think that some computationally bound  
molecules such as glutamate are the secret of qualia, then good luck  
with that.

Stuart LaForge
