[ExI] Can philosophers produce scientific knowledge?

Jason Resch jasonresch at gmail.com
Sun May 9 14:05:22 UTC 2021


On Sun, May 9, 2021, 6:31 AM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> Hi Jason,
> Thanks for jumping in, fun to have another participant, and thanks,
> Stathis, for posting this here, as evidently I would have missed it
> otherwise. I wouldn't have wanted to miss it, so I'm wondering why I did.
>
> On Sat, May 8, 2021 at 3:39 PM Stathis Papaioannou via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>>
>> On Sun, 9 May 2021 at 03:43, Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> To support Stathis's position:
>>
>> Functionalism requires 2 things:
>> 1. That the physics used by the brain is computable.
>> 2. That nothing in the brain requires an infinite amount of information.
>>
>
> this entire post is completely qualia blind, especially this list of
> requirements.  In other words, you are missing THE most important
> qualitative nature of consciousness and how we represent things like
> knowledge of colors.
>


I acknowledge that my post did not address qualia. Allow me to do so in
this post.

If the "bio-brain" is conscious, and the "compu-brain" preserves all the
relevant interrelationships of the bio-brain in an isomorphic manner, then
all externally visible behavior will likewise be the same.

The person with a compu-brain will still cry when in pain, still say
there's an incommunicable difference between red and green, and still
describe their dull backache in the same full detail as the person with the
bio-brain. If based on the brain of Chalmers or Dennett, the compu-brain
will even still write books on the mysteries of consciousness and qualia.

In short, there would be no objective or scientific test you could perform
to rule out the consciousness of the compu-brain, as all objective
behaviors are identical.

You could, however, reason that if philosophical zombies are logically
impossible, then identically functioning compu-brains must be conscious in
the same ways that bio-brains are conscious.

I see no rational basis for assuming that the compu-brain is not conscious
or is differently conscious. But there are rational bases for assuming the
two must be the same (e.g. dancing/fading qualia, self-reports,
non-dualism, non-epiphenomenalism, the successes of neural prostheses, the
anti-zombie principle).


> 3. There must be something that is responsible for each of the intrinsic
> qualities of each elemental piece of conscious knowledge, and you must be
> able to observe these computational differences.
>

Are you speaking from a first-person or a third-person viewpoint when you
say you must be able to observe computational differences?

I would say that there are many different ways one could write an
equivalent program/function, so it might not always be obvious from a
third-person view when a different computation results in different
consciousness. This is an aspect of all functionalist approaches; it
results in a feature (or bug) called "multiple realizability".
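As a loose illustration (not part of the original discussion), here is a
minimal Python sketch of multiple realizability: two functions written in
very different ways are indistinguishable by any third-person test of
their input/output behavior, and the difference only shows up when you
inspect the implementations themselves. The function names and probes are
hypothetical, chosen purely for illustration.

# Two different "realizations" of the same abstract function.

def sort_recursive(xs):
    """One realization: a recursive quicksort."""
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    return (sort_recursive([x for x in rest if x < pivot])
            + [pivot]
            + sort_recursive([x for x in rest if x >= pivot]))

def sort_iterative(xs):
    """Another realization: an iterative insertion sort."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

# A third-person "behavioral" observer sees identical outputs for every
# probe, even though the internal computations (call structure,
# intermediate states) differ. The difference is only visible by looking
# inside the implementations.
for probe in ([3, 1, 2], [], [5, 5, 1], list(range(10, 0, -1))):
    assert sort_recursive(probe) == sort_iterative(probe)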

The problem arises even in bio-brains. A dolphin has different brain
structures from a human, but most would admit that both dolphins and humans
can feel pain, despite these differences in their brains. So two different
brain states can result in similar conscious states.


> For example, by observe, I mean: if there is one pixel of visual knowledge
> switching from redness to greenness, and nothing else about your conscious
> state is changing, you of course must be able to directly apprehend the
> qualitative change in that pixel, and you must also be able to objectively
> observe from afar whatever it is in your brain that is responsible for that
> subjective change in experience.  And you must be able to do it in a way
> such that you can tell whether two people you are objectively observing
> have been engineered to have inverted red-green qualia, as depicted in this
> image from Wikipedia.
>

I agree that inverted qualia could only come about through functionally
different organizations of the brain. The idea that you could flip a
metaphysical switch and invert someone's qualia makes, I think, the same
error as assuming zombies are possible.

Jason