[ExI] Some new angle about AI

Lee Corbin lcorbin at rawbw.com
Sat Jan 2 19:22:45 UTC 2010


Gordon wrote:

> Stathis wrote:
> 
>> The only certain way to simulate a brain is to simulate the activity 
>> of neurons at the molecular level. 

I assume this means at the input/output level only, and
that anything further would not add to the experience
being had by the entity.
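
To make "input/output level" concrete, here is a toy sketch
(a standard leaky integrate-and-fire model, in Python; the
function name and every constant are merely illustrative
choices, not claims about real neurons). The neuron is
treated as a black box mapping input current to spike times,
with nothing molecular represented anywhere:

# Toy leaky integrate-and-fire neuron: an input/output-level model.
# The neuron is a black box that maps input current to spike times;
# no molecular (or even ion-channel) detail is represented.
# All constants are illustrative, not physiologically calibrated.
def simulate_lif(currents, dt=1.0, tau=20.0, v_rest=-65.0,
                 v_threshold=-50.0, v_reset=-70.0, resistance=10.0):
    """Return the list of time steps at which the neuron spikes."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(currents):
        # Membrane potential decays toward rest and is driven by input.
        v += (-(v - v_rest) + resistance * i_in) * (dt / tau)
        if v >= v_threshold:      # threshold crossed: emit a spike
            spikes.append(t)
            v = v_reset           # ... and reset the membrane potential
    return spikes

# A constant input held for 100 steps yields a regular spike train.
print(simulate_lif([2.0] * 100))

The point of the sketch is only that everything below the
spike-generating behavior has been abstracted away.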

> I agree with your general direction but I wonder how you
> know we needn't simulate them at the atomic or subatomic
> level. How do you know it's not turtles all the way down?

Let's suppose for a moment that Gordon is right. In other
words, suppose that the internal mechanisms of the neuron
must also be simulated.

I want to step back and reexamine why all of this is
important, and how our reasoning about it must be founded
on one axiom that is quite different from ordinary
scientific axioms.

And that axiom is moral: if presented with two simulations,
only one of which is a true emulation, and both are
exhibiting behavior indicating extreme pain, we want to
focus all relief efforts on that one alone. We really do
*not* care a bit about the other.

(Again, good philosophy is almost always prescriptive, or
entails prescriptive implications.)

For those of us who are functionalists (or, in my case, almost
100% functionalists), it seems almost inconceivable that the causal
components of an entity's having an experience require anything
beneath the neuron level. In fact, it's very likely that the
simulation of whole neuron tracts or bundles suffices.

But I have no way of going forward to address Gordon's
question. Logically, we have no way of knowing whether,
in order to emulate experience, you have to simulate every
single gluon, muon, quark, and electron. And, so far as
I can see, we can *never* even in principle begin to
answer that question, because ultimately all we'll have
to go on is behavior (with only a slight glance at the
insides).

I merely claim that if Gordon, or anyone else who doubts,
were to live 24/7 for years with an entity that acted
wholly and completely human, yet who was a known simulation
at, say, the neuron level, entirely composed of transistors
whose activity could be single-stepped through, then he
or she would soon apply the compassionate axiom, and would
find himself or herself incapable of betraying or
inflicting pain on his or her new friend any more than
upon a regular human.

Lee



