[ExI] Meaningless Symbols.

Stathis Papaioannou stathisp at gmail.com
Sun Jan 17 04:00:23 UTC 2010


2010/1/17 Gordon Swobe <gts_2000 at yahoo.com>:
> --- On Sat, 1/16/10, Eric Messick <eric at m056832107.syzygy.com> wrote:
>
>> Would you say that "description != thing" is the reason
>> computer systems cannot replicate the capability of brains to
>> understand?
>
> In a general sense, yes. I think S/H systems can simulate understanding as in weak AI but that they cannot have conscious understanding as in strong AI.
>
> And conscious understanding affects behavior including the behavior of those neurons associated with understanding, making weak AI itself a formidable challenge.

Let's look at what it means to say that conscious understanding
affects behaviour. Suppose you are deciding between buying full cream
or low fat milk. There are many considerations: the possible
differential effects on your health, what you will do with the milk
and the likely difference in taste according to application, the
preferences of anyone else you are going to share the milk with, and
so on. What seems a trivial task requires quite a deep understanding
of the world with analysis of your own preferences and the possible
consequences of your behaviour. But if I look inside your head I don't
see any of this internal debate. What I see is a complex dance of
electrical impulses across neurons, following mechanistic rules, and
resulting in a signal sent to the muscles in your arm so that you
reach out and pick the full cream milk. Did your understanding affect
your decision? In one sense, obviously yes; but from the external
observer's point of view your brain was just doing what it had to,
blindly following the laws of physics. If your understanding had been
different your decision might have been different, but your
understanding could *only* have been different if the physical
activity in your brain had been different, and a full account of the
physical effects is enough to explain the behaviour without reference
to consciousness. This makes consciousness epiphenomenal.

The same is true of a computer program. The computer does what it does
because its components follow the laws of physics. If the program were
different the computer would behave differently, but the program
could not be different unless the configuration of the computer
were different. The program is just a mental aid for the programmer to
set up the computer to behave in a particular way, and has no separate
causal potency of its own.


-- 
Stathis Papaioannou
