[extropy-chat] Fools building AIs (was: Tyranny in place)

Eugen Leitl eugen at leitl.org
Fri Oct 6 08:23:59 UTC 2006

On Thu, Oct 05, 2006 at 09:14:33PM -0400, Ben Goertzel wrote:

> I asked a friend recently what he thought about the prospect of
> superhuman beings annihilating humans to make themselves more
> processing substrate...
> His response: "It's about time."

Not quite, I hope. I'd like my kids to grow up.

> When asked what he meant, he said simply, "We're not so great, and

Not so great in comparison to what?

> we've been dominating the planet long enough.  Let something better

Better in which sense? Fitter doesn't mean better, especially
if you're on the receiving end.

> have its turn."

I understand Moravec is of a similar persuasion.
Well, it's an opinion, and I'm glad people with that
opinion are not in charge of the world.

> I do not see why this attitude is inconsistent with a deep
> understanding of the nature of intelligence, and a profound
> rationality.

I don't know what profound rationality is, but your brand
of it seems to exclude such basic fare as self-preservation.

Eugen* Leitl leitl http://leitl.org
ICBM: 48.07100, 11.36820            http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
