[extropy-chat] Fools building AIs

Mike Dougherty msd001 at gmail.com
Fri Oct 6 03:25:11 UTC 2006

On 10/5/06, Eliezer S. Yudkowsky <sentience at pobox.com> wrote:
> Highly skilled rationalists who understand intelligence are going to be
> on guard against:
> Separate magisteria;
> Extensional neglect;
> Scope neglect;
> Inconsistent evaluations of different verbal descriptions of the same
> events;
> not to mention,
> Failure to search for better third alternatives;
> Fatuous philosophy that sounds like deep wisdom; and
> Self-destructive impulses.
I do not claim to be a highly skilled rationalist.  So I will only ask a question.

What if the superior processing power was something similar to the Star Trek
Borg?   I know, this is "science [sic] fiction" but the concept is akin to
forced upload into a Matrioshka Brain.  Would we collectively be "better
off" in terms of escaping the zero-sum life experience?  Each individual in
the collective could feasibly believe they were at the top of the
hierarchy.   Rather than fighting amongst ourselves over the limited
biochemicals contained on this rock called Earth, we could simulate the
misery of earthly existence for the nostalgic masochists who refuse to move
on.  I know the 'Borg' were depicted as horrible enemies of humanity - but
once you get past the serialization-upload-deserialization procedure, "life"
as far as the uploaded are concerned could be at least equal, if not
infinitely better.

If this thought is amusing enough that someone will point out which of the
above enumerated rational failures were employed, I would actually appreciate
the feedback.
