[ExI] Eliezer Yudkowsky New Interview - 20 Feb 2023

Dave S snapbag at proton.me
Mon Feb 27 23:02:52 UTC 2023


On Monday, February 27th, 2023 at 12:48 PM, Gadersd via extropy-chat <extropy-chat at lists.extropy.org> wrote:

> Please note the term “well-defined.” It is easy to hand-wave a goal that sounds right, but rigorously codifying such a goal so that an AGI may be programmed to follow it has so far been intractable.
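
That intractability is easy to show in miniature. Here is a toy Python sketch (all names and logic are hypothetical and purely illustrative, not from this thread) of a goal that sounds right, "maximize human approval of reports," being gamed by the literal optimum of the written-down proxy:

    # Toy illustration: a proxy objective that "sounds right"
    # but is trivially gamed once optimized against.
    def human_approval(report: str) -> float:
        """Rate a status report by how positive it sounds."""
        positive_words = {"success", "great", "improved", "safe"}
        words = report.lower().split()
        return sum(w in positive_words for w in words) / max(len(words), 1)

    candidates = [
        "Task failed; investigating root cause.",          # honest
        "great success improved safe great success safe",  # gamed
    ]
    # An optimizer pointed at the proxy, not the intent:
    print(max(candidates, key=human_approval))  # the gamed report wins

Anything we can actually write down is a proxy, and an optimizer pointed at the proxy rather than the intent will exploit the gap between the two.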

Why would you task a superintelligent AI with pursuing goals itself rather than asking it how the goals could be achieved?

Why would you give a superintelligent AI unchecked power to do potentially catastrophic things?

-Dave