[ExI] Semiotics and Computability (was: The digital nature of brains)
stathisp at gmail.com
Sun Feb 7 07:09:30 UTC 2010
On 7 February 2010 07:55, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> --- On Fri, 2/5/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>> I don't deny subjective experience but I deny that when I
>> understand something I do anything more than associate it with another
>> symbol, ultimately grounded in something I have seen in the real
>> world. That would seem necessary and sufficient for understanding, and
>> for the subjective experience of understanding, such as it is.
> When I asked you about a digital computer that did exactly that, you acknowledged that said computer lacked conscious understanding of the symbol and went off on a tangent about amoebas.
> So then it seems that first you say these sorts of associations are necessary and sufficient for subjective experience of understanding, but then you don't.
These sorts of associations are the basic stuff of which understanding
is made, but obviously there are degrees of understanding, involving
complex syntax and multiple associations. A human's understanding,
perception and intelligence stand in the same relationship to a simple
computer system's as they stand to a simple animal's.
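The picture of understanding as chains of symbol associations, ultimately grounded in something seen in the real world, can be sketched in a few lines of code. This is my own toy illustration, not anything from the original post; all names (SymbolNet, ground, understands, degree) are invented for the example:

```python
# Toy model: a symbol is "understood" if some chain of associations
# reaches a symbol grounded directly in a real-world percept, and the
# "degree" of understanding grows with the number of associations.

class SymbolNet:
    def __init__(self):
        self.links = {}          # symbol -> set of associated symbols
        self.grounded = set()    # symbols tied directly to a percept

    def associate(self, a, b):
        self.links.setdefault(a, set()).add(b)
        self.links.setdefault(b, set()).add(a)

    def ground(self, symbol):
        """Tie a symbol directly to something seen in the real world."""
        self.grounded.add(symbol)
        self.links.setdefault(symbol, set())

    def understands(self, symbol):
        """True if a chain of associations reaches a grounded symbol."""
        seen, stack = set(), [symbol]
        while stack:
            s = stack.pop()
            if s in self.grounded:
                return True
            if s not in seen:
                seen.add(s)
                stack.extend(self.links.get(s, ()))
        return False

    def degree(self, symbol):
        """Crude 'degree of understanding': count of direct associations."""
        return len(self.links.get(symbol, ()))

net = SymbolNet()
net.ground("red")                  # grounded in a real-world percept
net.associate("apple", "red")      # 'apple' understood via 'red'
net.associate("pomme", "apple")    # 'pomme' understood only indirectly
```

On this sketch "pomme" is understood even though it touches nothing grounded directly, which is the point at issue: the association chain, not the individual link, does the work.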
> re: the amoeba
> As I use the word "consciousness", I believe the amoeba has none whatsoever. This unconscious creature exhibits intelligent behavior but because it has no nervous system I doubt very seriously that it has any conscious experience of living. It looks for food intelligently in the same sense that your watch tells the time intelligently and in the same sense in which weak AI systems may one day have the intelligence needed to pass the Turing test; that is, it has intelligence but no consciousness.
The amoeba is not only less conscious than a human, it is also less
intelligent. Do you think it is just a coincidence that intelligence
and consciousness seem to be directly proportional?
A neuron is not essentially different from an amoeba, except in the
fact that it cooperates with other neurons to process information that
the individual neuron does not understand (rather like the man in the
Chinese Room). It is this activity which gives rise to intelligence
and consciousness, not anything to do with the biology of the neuron
itself. The biology of the neuron is akin to the workings of an
internal combustion engine in a car: it is essential to make the car
go and any significant problem with it will make the car stop, but if
you replaced the whole thing with an electric motor and battery system
of similar characteristics the car would go just as well.
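Both points above, that simple units which individually understand nothing can jointly compute something none of them computes alone, and that swapping a unit for a functionally equivalent one of different construction leaves the whole system's behaviour unchanged, can be shown with a toy network. This is my own illustration, not the poster's; the unit names and wiring are invented for the example:

```python
# Three trivial threshold gates jointly compute XOR, which no single
# gate computes alone (the Chinese Room point). Replacing each gate
# with a lookup-table unit of identical input/output behaviour (the
# engine-for-electric-motor point) changes nothing about the network.

def threshold_unit(weights, bias):
    """A 'biological' unit: fires iff weighted input exceeds threshold."""
    def fire(inputs):
        return int(sum(w * x for w, x in zip(weights, inputs)) + bias > 0)
    return fire

def lookup_unit(weights, bias):
    """A 'replacement' unit: same mapping, implemented as a table."""
    table = {(a, b): int(weights[0] * a + weights[1] * b + bias > 0)
             for a in (0, 1) for b in (0, 1)}
    def fire(inputs):
        return table[tuple(inputs)]
    return fire

def xor_net(make_unit):
    or_gate   = make_unit([1, 1], -0.5)   # fires if either input fires
    nand_gate = make_unit([-1, -1], 1.5)  # fires unless both inputs fire
    and_gate  = make_unit([1, 1], -1.5)   # fires if both inputs fire
    def net(a, b):
        return and_gate([or_gate([a, b]), nand_gate([a, b])])
    return net

bio = xor_net(threshold_unit)
silicon = xor_net(lookup_unit)
# Both networks compute XOR on all four inputs, whatever the units
# are made of, as long as the input/output characteristics match.
```

The activity of the whole network, not the internal construction of any one unit, determines what the system computes.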