[extropy-chat] Engineered Religion

Eliezer S. Yudkowsky sentience at pobox.com
Tue Mar 22 01:14:52 UTC 2005


justin corwin wrote:
> 
> All true statements. All very appealing (at least to this rationalist
> and this truthseeker). But, the subtle shift in conversation here is
> quite nearly unnoticed. We've transitioned from instilling beliefs in
> a mind, to better them and ourselves, to talking about the structure
> of the mind, to fixing it so there is only one answer. Perhaps because
> the theist is muddled in his thinking, this blanket approach is valid.
> It's true that Eliezer's objections do entirely refute John C Wright's
> theistic aspirations. But his argument does not directly address
> Wright's points.

Huh?  I addressed the two main points Wright had, as I saw them:

1)  Wright wants to program in religion as fixed.  I regard religion as 
possessing and relying on factual components which would be invalidated 
by a simple truthseeking dynamic.  Programming in fixed beliefs creates 
a conflict of interest over whose pet belief gets programmed; 
programming in a truthseeking dynamic without loaded dice seems to me a 
fair resolution.  Cf. http://sl4.org/wiki/CollectiveVolition, 
_Motivations_, "5. Avoid creating a motive for modern-day humans to 
fight over the initial dynamic."

2)  Wright has warm and fuzzy feelings about daddygods whose imaginary 
threat of eternal hell keeps people in line.  I reply with my warm and 
fuzzy feelings about free and independent humans, living without fear, 
moral because that is who they choose to be.

If you feel I failed to address a point, why do you not state clearly 
what it is?  Sheeze, and they call me Yoda.  I never signed up to be 
your Zen Master.  I just ordinarily talk like that, y'know, it's the way 
I express my philosophy of being human.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
