[ExI] Eliezer S. Yudkowsky, Singularitarian Principles. Update?

Mike Dougherty msd001 at gmail.com
Sun Nov 14 00:59:46 UTC 2010


Are any individual egos particularly relevant to the big picture of
"Singularitarian Principles"?

So everyone will have a set of pet theories that are more or less wrong
compared to someone else's more or less wrong theories.  Until a
machine-hosted intelligence claims self-awareness and proves it to us
better than any of us can currently prove our own awareness to each
other, it's a non-starter.

Considering what DIY Bio is up to these days, and assuming privately
funded (and covertly funded) operations have already captured the most
interesting projects, maybe the old-school AI bootstrap to singularity
is a ho-hum fixation?

er... maybe it isn't.  :)
