[ExI] The digital nature of brains (was: digital simulations)

Gordon Swobe gts_2000 at yahoo.com
Mon Feb 1 13:29:15 UTC 2010


--- On Mon, 2/1/10, Stathis Papaioannou <stathisp at gmail.com> wrote:

>> Right, and Stathis' subject will eventually pass the
>> TT just as your subject will in your thought experiment. But
>> in both cases the TT will give false positives. The subjects
>> will have no real first-person conscious intentional
>> states.
> 
> I think you have tried very hard to avoid discussing this
> rather simple thought experiment. It has one premise, call it P:

I didn't avoid anything. We went over it a million times. :)
 
> P: It is possible to make artificial neurons which behave
> like normal neurons in every way, but lack consciousness.

P = true if we define behavior as you've chosen to define it: the release of certain neurotransmitters into the synapses at certain times, and other similar exchanges between neurons. I reject as absurd, for example, your theory that a brain the size of Texas, constructed of giant neurons made of beer cans and toilet paper, will have consciousness merely by virtue of those beer cans squirting neurotransmitters betwixt themselves in the same patterns that natural neurons do. I also reject, in the first place, your implied assumption that the neuron is necessarily the atomic unit of the brain.

> OK, assuming P is true, what happens to a person's
> behaviour and to his experiences if the neurons in a part of his 
> brain with an important role in consciousness are replaced with these
> artificial neurons?

As I explained many times, because your artificial neurons will not give the patient complete subjective experience, and because experience affects behavior in healthy people, the surgeon will need to keep re-programming the artificial neurons, and most likely replacing and re-programming other neurons as well, until at long last he creates a patient who passes the Turing test. But that patient will have no better quality of consciousness than he started with, and may be far worse off subjectively by the time the surgeon finishes, depending on facts about neuroscience that in 2010 nobody knows.

Eric offered a more straightforward experiment in which he simulated the entire brain. You complicate the matter by doing partial replacements, but the principles that drive my arguments remain the same: formal programs do not have or cause minds. If they did, the computer in front of you this very moment would have a mind and would perhaps be entitled to vote like other citizens.

-gts
