[ExI] What might be enough for a friendly AI?

Florent Berthet florent.berthet at gmail.com
Thu Nov 18 01:48:42 UTC 2010


2010/11/18 spike <spike66 at att.net>

> … On Behalf Of Florent Berthet
>

> >…I don't really mind dying if my successors (supersmart beings or
> whatever) can be hundreds of times happier than me…
>
> More generally, wouldn't it be a shame to prevent an AGI from creating an
> advanced civilization (e.g. computronium-based) just because this outcome
> could turn out to be less "friendly" to us than the one produced by a
> human-friendly AGI?  In the end, isn't the goal to maximize collective
> happiness?
>
>
>
> Florent, you are a perfect example of a dangerous person to have on the AGI
> development team.  You (and I too) might go down this perfectly logical line
> of reasoning, then decide to take it upon ourselves to release the AGI, in
> order to maximize happiness.
>

Do you know what the SingInst folks (whom I support, by the way) think about
that?

> >…So why don't we just figure out how to make the AGI understand the
> concept of happiness (which shouldn't be hard since we already understand
> it), and make it maximize it?
>
>
>
> Doh!  You were doing so well up to that point, then the fumble right at the
> goal line.  We don’t really understand happiness.  We know what makes us
> feel good, because we have endorphins.  An AGI would (probably) not have
> endorphins.  We don’t know if it would be happy or what would make it happy.
>
>
>
> spike
>
>
>

Yeah, I was tempted to moderate this statement. What I meant was that
although we don't fully grasp all the mechanisms of the feeling of
happiness, and we certainly don't know all the kinds of happiness that
could exist, we understand reasonably well what it means for somebody to be
happy or unhappy. An AGI should be able to get this too, for it would
understand that we all seek this state of mind, and it would probably try
to duplicate the phenomenon in itself (which shouldn't be hard, because
everything is computable, the effects of endorphins included).
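
To make "make it maximize it" concrete, here is a minimal toy sketch in
Python (my own illustration, not anything from SingInst): an agent that
picks whichever action its world model predicts will yield the highest
total happiness. Both happiness_model and predict are invented stand-ins;
as spike points out, writing them down correctly is exactly the hard part.

    def happiness_model(state):
        # Hypothetical stand-in: total reported well-being in the world.
        return sum(state["well_being"].values())

    def predict(state, action):
        # Toy world model: an action shifts each person's well-being.
        return {"well_being": {name: wb + action.get(name, 0)
                               for name, wb in state["well_being"].items()}}

    def choose_action(state, actions):
        # Pick the action whose predicted outcome scores highest.
        return max(actions, key=lambda a: happiness_model(predict(state, a)))

    world = {"well_being": {"alice": 5, "bob": 3}}
    options = [{"alice": 1}, {"bob": 2}, {"alice": -1, "bob": 1}]
    print(choose_action(world, options))  # -> {'bob': 2}

The sketch dodges every real difficulty (whose happiness counts, how it is
measured, how the model is kept honest): the maximizing loop is trivial,
the objective is not.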

