[ExI] Yes, the Singularity is the greatest threat to humanity

Anders Sandberg anders at aleph.se
Mon Jan 17 23:06:31 UTC 2011


Stefano Vaj wrote:
>
> On the other hand, given Wolfram's Principle of Computational
> Equivalence, which I have always found pretty persuasive, there are
> no things more intelligent than others once the very low level of
> complexity required for exhibiting universal computing features is
> reached. There are just things that execute different programs with
> different performances.
>   

Hmm, as a neuroscientist I think the computational complexity of a 
chimpanzee brain and a human brain is essentially identical. Yet I 
think we both agree that humans can think of and do things chimps could 
not possibly come up with, no matter how much time they were given.

There seem to be pretty firm limits to human cognition, such as 
working memory (a limited number of chunks, ~3-5) or the complexity of 
predicates that can be learned from examples (3-4 logical connectives 
mark the limit). These limits do not seem to be fundamental to 
computation in general; they are merely due to the particular make of 
human brains. An entity not bound by them would be significantly 
smarter than us.

It is easy to train a neural network to recognize clusters in a 
50-dimensional space from a few examples, yet humans cannot do this in 
general. I have a hard time believing this is just a matter of 
different performance: we are computational systems with very different 
intrinsic biases and learning abilities.
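
For concreteness, here is a minimal sketch of that claim in Python using 
scikit-learn (my own choice of library, network size and cluster 
parameters, not anything specified above): a small neural network is 
trained on only a handful of labelled points from three clusters in a 
50-dimensional space and then classifies held-out points.

    # Sketch: a small neural net separates clusters in 50 dimensions
    # from a few labelled examples. All parameters here are illustrative.
    from sklearn.datasets import make_blobs
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    # Three well-separated clusters of points in a 50-dimensional space.
    X, y = make_blobs(n_samples=300, n_features=50, centers=3,
                      cluster_std=2.0, random_state=0)

    # Keep only 15 labelled examples for training; test on the rest.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, train_size=15, stratify=y, random_state=0)

    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                        random_state=0)
    clf.fit(X_train, y_train)

    print("accuracy on held-out points:", clf.score(X_test, y_test))

A task like this is trivial for such a system, while no amount of 
staring at the raw 50-dimensional coordinates would let a human do it.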

-- 
Anders Sandberg,
Future of Humanity Institute
Philosophy Faculty of Oxford University 



