[ExI] digital simulations, descriptions and copies

Stathis Papaioannou stathisp at gmail.com
Wed Jan 20 13:45:07 UTC 2010


2010/1/20 Gordon Swobe <gts_2000 at yahoo.com>:
> --- On Tue, 1/19/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>
>> The "matter thinks" theory of mind can't explain where how
>> people understand words either.
>
> I don't consider myself in possession of any "theory". I just observe that the brain as it exists in nature does not seem to work like a digital computer.
>
> We notice a phenomenon and we try to put together hypotheses to explain it. If a given hypothesis fails, we toss it out and keep trying.
>
> In this case we notice the phenomenon of consciousness/semantics -- the conscious understanding of words.
>
> Hoping to explain the conscious understanding of words and other puzzles, some clever techno-geeks with just enough knowledge of philosophy to be dangerous put together the so-called "computationalist theory of mind". It seemed like a great idea at the time.
>
> The running of mental programs might explain the method by which we think, but it fails to explain how we *know* about our own thoughts. The brain must then do something else besides run programs. So I toss that hypothesis out as incomplete or wrong and keep trying.
>
> Philosophically, to resolve the consciousness/semantics problem the computationalist theory must commit the homunculus fallacy. If the brain equals a digital computer and if the mind equals a program running on that computer then there must exist a homunculus to operate and observe that computer. So the theory fails.

The brain is not organised along the lines of a digital computer, but
it is organised along the lines of an information processing system
that might evolve naturally, a design that artificial neural networks
copy. That is the important part of it as far as evolution is
concerned; consciousness, such as it is, is a side-effect. In any
case, you have not presented *any* theory of what consciousness may be
due to, if not a side-effect of information processing. If the NCC
(neural correlate of consciousness) is a particular sequence of
chemical reactions, why should that count as an "explanation"? You
could argue that there is no understanding in chemical reactions, that
the idea is totally ridiculous, and that therefore there must be an
immaterial soul of which the chemical reaction is just a physical
manifestation. And you could make this argument for *any* physicalist
theory, just as you make it for computation.
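As a purely illustrative aside, a toy sketch of what "artificial
neural network" means here: layers of units that sum weighted inputs
and apply a nonlinearity. Nothing below is trained or specific to the
argument; it only shows the kind of organisation being referred to.

    import numpy as np

    # A toy two-layer feedforward network: each "neuron" sums its
    # weighted inputs and applies a nonlinearity, loosely echoing how
    # biological neurons integrate signals. Weights are random, not
    # trained, so this is organisation only, not learning.
    rng = np.random.default_rng(0)

    def layer(x, w, b):
        return np.tanh(w @ x + b)   # weighted sum, then nonlinearity

    x = rng.normal(size=3)                    # three input signals
    w1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
    w2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

    hidden = layer(x, w1, b1)   # four hidden units
    output = layer(hidden, w2, b2)   # two output units
    print(output)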

These arguments are mere speculation with no logical or empirical
force; it is just an opinion that matter, physical activity, or matter
specifically engaged in computation cannot give rise to consciousness.
But the proof still stands that if consciousness is separable from
intelligence, then you are forced into an absurd position on what
consciousness is, regardless of which theory eventually turns out to
be right.


-- 
Stathis Papaioannou


