[ExI] Do digital computers feel?
Brent Allsop
brent.allsop at gmail.com
Mon Feb 6 04:14:31 UTC 2017
Hi Stathis,
Does this help?
When you talk about "*only observable behaviour*" you are assuming a
definition of "observe" that is completely qualia blind. There isn't
anything special about qualia, but there is something qualitative about
them, which can't be captured by "*only observable behaviour*". You can
detect and represent qualia with any physical behavior you want, but you
can't know what an abstracted representation of what you have detected
qualitatively represents unless you know how to interpret that behavior
back to the real thing. In order to include the qualities of conscious
experience in a definition of observation, you must provide a
definition of "observe" that includes the proper qualitative
interpretation of any abstracted representations back into whatever it
is that has the redness quality being observed.
I imagine a simple-minded engineer working to design a perfect glutamate
detection system that can't be fooled into giving a false response by
any substance or system that is not glutamate. It is certainly possible
that some complex set of functions or physical behaviors, like
glutamate, is one and the same as something we can experience as a
complex redness, and that nothing else will have the same physical
function or quality.
Once your simple-minded engineer's glutamate detector is working, it
will never find anything that is glutamate in the rods and wires
engineered to simulate glutamate in an abstracted way. Also, without
the correct translation hardware, you will not be able to interpret any
abstracted representations of glutamate (redness) as if they were
glutamate (redness), let alone think they are real glutamate (real
redness).
And of course, when you neurally substitute the glutamate and its
detector for some simulation of the same, the neural substitution
fallacy should be obvious. The substitution will only work when you
completely swap out the entire detection system for something else that
knows how to interpret that which is not glutamate as if it were
representing glutamate. Only then will it be *observably the same
behavior*. But nobody will claim that your simulation has any real
glutamate being used for representations of knowledge: glutamate being
something that physically functions identically to glutamate (or
redness) without any hardware interpretation mechanism being required.
Brent
On 2/3/2017 9:22 PM, Stathis Papaioannou wrote:
> I don't see why it's obviously fallacious or obvious. You won't engage
> with what is the relatively simple question of *observable behaviour*.
> Humans have moving parts: molecules, ion currents, ultimately bones
> which are pulled by tendons connected to muscles which are controlled
> by nerves. Consider just these mechanical processes. Do you agree that
> they can be replicated using alternative materials and devices, for
> example tiny electric motors in place of actin-myosin in the process
> of exocytosis whereby neurotransmitters are released into the synapse,
> titanium rods in place of bones, artificial isotopes of potassium and
> sodium? Or do you think there is some theoretical reason (not just a
> practical, engineering reason) why this can't be done with particular
> components of a biological system - and if so, what is it that makes
> those components special? Please answer considering the following:
> imagine you are a simple-minded engineer who has no idea about
> consciousness and whose job is *only* to examine the part of the body
> you are assigned and design a replacement part using various
> electrical and mechanical nanocomponents.