[ExI] Yes, the Singularity is the greatest threat to humanity
Stefano Vaj
stefano.vaj at gmail.com
Tue Jan 18 16:11:16 UTC 2011
On 18 January 2011 00:06, Anders Sandberg <anders at aleph.se> wrote:
> There seem to exist pretty firm limits to human cognition such as working
> memory (limited number of chunks, ~3-5) or complexity of predicates that can
> be learned from examples (3-4 logical connectives mark the limit). These
> limits do not seem to be very fundamental to all computation in the world,
> merely due to the particular make of human brains. Yet an entity that was
> not as bound by these would be significantly smarter than us.
Simply a matter of performance. If a man with a piece of paper can
operate a cellular automaton, and a cellular automaton can
demonstrably perform any computation at all, then a man with a
piece of paper can do whatever any non-quantum computer can do, given
enough time (and paper).
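The "man with a piece of paper" point can be made concrete with Rule
110, an elementary cellular automaton that Matthew Cook proved
Turing-complete: its update rule is simple enough to run by hand, yet
it can in principle perform any computation. A minimal sketch (the
fixed-zero boundary and single-seed start are illustrative choices,
not part of the proof):

```python
# Rule 110: each cell's next state is a function of its (left, self,
# right) neighborhood. The 8-bit number 110 encodes the lookup table.
RULE = 110

def step(cells):
    """Apply one Rule 110 update to a list of 0/1 cells.
    Cells beyond the edges are treated as 0 (an illustrative choice)."""
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        right = cells[i + 1] if i < n - 1 else 0
        # Pack the 3-cell neighborhood into a number 0..7 and look up
        # the corresponding bit of RULE.
        pattern = (left << 2) | (cells[i] << 1) | right
        out.append((RULE >> pattern) & 1)
    return out

# Start from a single live cell and watch structure emerge.
cells = [0] * 20 + [1] + [0] * 20
for _ in range(10):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Every step here is a trivial table lookup a person could do on paper;
universality comes from iterating it long enough on a wide enough tape.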
So, yes, we could have human emulations running faster. I am inclined
to postulate, however, that they would be externally indistinguishable
from a man with a powerful enough computer at his fingertips.
--
Stefano Vaj