[ExI] Superrationality

Eliezer Yudkowsky sentience at pobox.com
Wed May 7 05:36:22 UTC 2008


On Tue, May 6, 2008 at 8:44 PM, Lee Corbin <lcorbin at rawbw.com> wrote:
>
>  Precisely. I would, for example, cooperate only with a "close duplicate",
>  since, physically speaking, whatever I do my close duplicate also does,
>  to a high degree of fidelity.
>
>  But I have stated above the necessary and sufficient conditions for
>  superrationality to obtain, and so, lacking "close physical duplicates",
>  superrationality at the present time in human history is impossible.

Your condition is sufficient but not necessary.  It suffices for the
players to know each other's algorithms.
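
If each player can inspect and run the other's decision procedure,
each can condition its move on the other's predicted move, so mutual
cooperation can be a stable outcome even between agents that are not
duplicates.  A toy sketch in Python (a bounded mutual-simulation
scheme offered purely as illustration, not as any particular
formalism):

    COOPERATE, DEFECT = "C", "D"

    def agent_a(opponent, depth=3):
        # Predict the opponent by running its algorithm against ours,
        # with a recursion bound so the mutual simulation terminates.
        if depth == 0:
            return COOPERATE  # base case: assume cooperation
        prediction = opponent(agent_a, depth - 1)
        return COOPERATE if prediction == COOPERATE else DEFECT

    def agent_b(opponent, depth=3):
        # A differently written agent, not a duplicate of agent_a,
        # following the same "cooperate iff you predict cooperation" rule.
        if depth == 0:
            return COOPERATE
        return COOPERATE if opponent(agent_b, depth - 1) == COOPERATE else DEFECT

    print(agent_a(agent_b), agent_b(agent_a))  # prints: C C

Neither agent is a physical duplicate of the other, yet each, knowing
the other's algorithm, predicts cooperation and therefore cooperates.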

-- 
Eliezer Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence


