[ExI] The second step towards immortality

Anders Sandberg anders at aleph.se
Thu Jan 9 22:59:32 UTC 2014


On 09/01/2014 21:18, Ben wrote:
> To me, this sounds analogous to "one day, someone will get so good at
> simulating music using digital code that it could convince some people
> that it really /is/ music (even if the writers know that it is just a
> pile of clever code)!!.  What a jape!"
>
> Something about ducks sounding and looking like ducks comes to mind.
>
> (IOW, an emulation of an information process is an information process)

Yes, but can you tell it apart from a simulation?

I can construct a function f(x,y) that produces x+y for practically any 
values you care to test, but actually *isn't* x+y [*]. Without looking 
under the hood, the two cannot be told apart. The same goes for any 
information process.

If what you care about is the stuff coming out of the black box, then 
what matters is whether there are any relevant differences between the 
output and what it should be. But sometimes we care about how the stuff 
is made. Two black boxes promising to multiply numbers using a cruel 
child-labour implementation of the Chinese Room are not morally 
equivalent if one of them cheats by using a microprocessor instead of 
little orphans. Even most Strong AI proponents [**] think that a 
stimulus-response lookup table that passes the Turing test is neither 
conscious nor intelligent, despite being (by definition) able to 
convince the interlocutor indefinitely.
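
(To make concrete what such a lookup table amounts to, here is a toy 
sketch in C. The entries are invented for illustration; a real 
Turing-test table would key on the entire conversation history and 
would be astronomically large, though still finite.)

#include <stdio.h>
#include <string.h>

/* Toy stimulus-response table: every reply is retrieved, never computed. */
struct entry { const char *stimulus; const char *response; };

static const struct entry table[] = {
    { "Hello",              "Hello! How are you today?" },
    { "Are you conscious?", "I often wonder about that myself." },
    { "What is 2+2?",       "4, of course." },
};

static const char *respond(const char *stimulus)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].stimulus, stimulus) == 0)
            return table[i].response;
    return "Interesting. Tell me more.";   /* filler for missing entries */
}

int main(void)
{
    char line[256];
    while (fgets(line, sizeof line, stdin)) {
        line[strcspn(line, "\n")] = '\0';  /* strip trailing newline */
        printf("%s\n", respond(line));
    }
    return 0;
}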


[*] Trivial example (using long long so the magic constants actually fit):
long long f(long long x, long long y)
{
   /* cheat on exactly one input pair, add correctly everywhere else */
   if (x==34083480008589LL && y==2389393939393473LL)
     return 0;
   return x+y;
}
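
(A minimal black-box tester shows why probing won't expose it: it 
hammers f with ten million random pairs and, with overwhelming 
probability, never lands on the one poisoned input.)

#include <stdio.h>
#include <stdlib.h>

/* the cheating adder from above */
long long f(long long x, long long y)
{
   if (x==34083480008589LL && y==2389393939393473LL)
     return 0;
   return x+y;
}

int main(void)
{
    srand(42);                      /* fixed seed for a reproducible run */
    for (int i = 0; i < 10000000; i++) {
        long long x = rand(), y = rand();
        if (f(x, y) != x + y) {
            printf("caught the cheat at (%lld, %lld)\n", x, y);
            return 1;
        }
    }
    /* rand() typically cannot even reach the magic constants,
       which only makes the black-box tester's job more hopeless */
    puts("ten million probes: indistinguishable from addition");
    return 0;
}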

Nontrivial example where *nobody* currently knows if it actually 
computes addition of positive numbers: the loop below is the Collatz 
(3n+1) iteration on z = x+y, so the function returns the correct sum 
whenever it returns at all, but whether it halts for every positive 
starting value is precisely the open Collatz conjecture:
long long f(long long x, long long y)
{
   long long z=x+y;
   long long w=z;
   while (w!=1)    /* Collatz iteration: halts iff the conjecture holds for z */
     if (w%2==0)
       w/=2;
     else
       w=w*3+1;
   return z;       /* the correct sum, if the loop ever exits */
}


[**] I admit I am not entirely sure anymore. I thought it was obvious, 
but David Chalmers made me doubt whether causal relatedness is actually 
necessary for consciousness. If it isn't, then lookup tables might be 
conscious after a fashion. Or perhaps the sum total of consciousness 
expressed by all possible interactions with the table already exists, 
or existed at the moment the table was computed.

-- 
Anders Sandberg,
Future of Humanity Institute
Oxford Martin School
Faculty of Philosophy
Oxford University


