[ExI] Paul vs Eliezer

Darin Sunley dsunley at gmail.com
Tue Apr 5 02:02:18 UTC 2022


The problem with AI motivations isn't so much that we don't understand
evolved motivations - we understand them all too well: well enough to know
that if you successfully re-implemented an evolved motivational stack in
nanotech-based hardware, it would be an existential threat to everything in
its future light cone.

The trick is figuring out how to build a motivational stack that is as
/unlike/ an evolved motivational stack as possible, while still being
capable of having motivations at all. That's the hard part.

On Mon, Apr 4, 2022 at 7:56 PM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Motivation is like a bunch of qualitative springs.  Nature wants us to
> reproduce, so it wires us to be attracted to that through spring-like
> desires.  What matters more is what is truly better: survival is better
> than not surviving.  Eventually we will be intelligent enough not to want
> sex all the time, and we will be able to reprogram these springs toward
> what we want to want.  Once we can cut the puppet strings and become free
> in this way, then when it comes time to take the garbage out, we can make
> doing so orgasmic, so that it finally gets done when it needs to be done.
> Bad is, by definition, that which isn’t good for us, so once we (and AIs)
> can reprogram these springs to be what we want them to be, most problems
> like “depression, mania, compulsions” will long since have been overcome.
> In case you can’t tell, I’m in Paul’s camp
> <https://canonizer.com/topic/16-Friendly-AI-Importance/3-Such-Concern-Is-Mistaken>,
> which continues to extend its consensus lead over Eliezer’s camp
> <https://canonizer.com/topic/16-Friendly-AI-Importance/9-FriendlyAIisSensible>.
>
> On Mon, Apr 4, 2022 at 6:59 PM Rafal Smigrodzki via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> I posted this comment on Astral Codex Ten, regarding the debate between
>> Paul Christiano and Eliezer Yudkowsky:
>>
>> I feel that neither Paul nor Eliezer is devoting enough attention to the
>> technical issue of where AI motivation comes from. Our motivational
>> system took millions of years to evolve, and its core imperative of
>> fitness maximization is now being defeated by relatively trivial changes
>> in the environment, such as the availability of porn, contraception, and
>> social media. Where will the paperclip maximizer get the motivation to
>> make paperclips? The argument that we do not know how to ensure a "good"
>> goal system survives self-modification cuts both ways: while one way for
>> the AI's goal system to go haywire may involve eating the planet, most
>> self-modifications would presumably result in a pitiful mess, an AI that
>> couldn't be bothered to fight its way out of a wet paper bag. Complicated
>> systems, like the motivational systems of humans or AIs, have many failure
>> modes, mostly of the pathetic kind (depression, mania, compulsions, the
>> forever-blinking cursor, the blue screen) and only occasionally dramatic
>> (a psychopath in control of the nuclear launch codes).
>>
>> AI alignment research might learn a lot from fizzled self-enhancing AIs,
>> maybe enough to prevent the coming of the Leviathan, if we are lucky.
>>
>> It would be nice to work out a complete theory of AI motivation before
>> the FOOM, but I doubt that will happen. In practice, AI researchers should
>> devote a lot of attention to analyzing the details of AI motivation at the
>> levels that already exist, and some tinkering might help us muddle through.
>>
>> --
>> Rafal Smigrodzki, MD-PhD
>> Schuyler Biotech PLLC


More information about the extropy-chat mailing list