[ExI] Is the brain a digital computer?
stathisp at gmail.com
Tue Feb 23 13:52:17 UTC 2010
On 23 February 2010 23:42, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> --- On Mon, 2/22/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>> You're setting yourself up for the obvious reply: it is
>> possible to make an artificial heart that pumps blood just as well as...
> Your answer misses the point. I'll try again in a different way:
> You have three powerful futuristic digital computers, call them H, S, and B, running on the desk in front of you. On H runs a copy of some futuristic software titled "Simulated Heart v. 1254". On S runs a copy of "Simulated Stomach v. 2434". On B runs a copy of "Simulated Brain v. 0989873".
> On computer H the simulated heart simulates the processing of simulated blood, and you have no objection I assume to my saying the simulated heart running on H doesn't really pump blood.
> On computer S the simulated stomach simulates the processing of simulated food, and you have no objection I assume to my saying the simulated stomach running on S doesn't really process food.
> On computer B the simulated brain simulates the processing of simulated thoughts, but here you do object when I tell you the simulated brain doesn't really think. Here you want to tell me the simulated brain really does think real thoughts.
> How do you explain your inconsistency?
> More precisely, why do you classify "thoughts" in a different category than you do "blood" and "food", if not because you have adopted a dualistic world-view in which mental phenomena fall into a different category than do ordinary material entities?
H' - connect H to a pump so that it pumps blood.
S' - connect S to a motorised cavity so that it processes food.
B' - connect B to cameras, microphones, speakers, electric motors etc.
so that it interacts with its environment in an intelligent way.
H' and S' are not *identical* to a heart or stomach but they perform
the *function* of a heart or stomach. Similarly, B' is not *identical*
to a brain but it performs the function of a brain. You probably don't
even need all the sensors and effectors since a person can still think
if they are paralysed and deprived of sensory input.
You won't claim that H' doesn't "really" pump blood but only pretends
to pump blood. It's obvious that if it pumps blood, it pumps blood. To
some people (especially on this list) it's equally obvious that B'
must be able to think. It's not quite so obvious to me: I entertain
the possibility that consciousness is different to other phenomena
in the universe and might be separable from the observable behaviour
it seems to underpin. That would mean I could replace part
of my brain with a functionally identical but unconscious analogue,
selectively removing any aspect of my consciousness without noticing
that anything had changed and without displaying any outward change in
behaviour. I believe that is absurd, and this leads me to conclude
that those who immediately saw that B' must be conscious were right.
As far as I have been able to tell, you also agree that becoming a
partial zombie without noticing or showing any outward behavioural
change is absurd, but you still think that it is possible to make
zombie brains or brain components. Several times you have said that
the zombie components would cause the recipient to behave differently,
but this is obviously a contradiction, since you agreed that the
zombie components can be made to behave exactly like biological
components. Philosophical discussions often just fizzle out without
consensus being reached, but when one party claims that both P and ~P
are true I think everyone would agree that they have lost at least
that part of the debate.