[extropy-chat] FAI: Collective Volition

Eliezer Yudkowsky <sentience at pobox.com>
Mon May 31 10:47:55 UTC 2004


An update to that part of Friendly AI theory that describes Friendliness, 
the objective or thing-we're-trying-to-do.  Those of you who have 
complained about insufficient specification will now have many other things 
to complain about instead.

http://www.sl4.org/bin/wiki.pl?CollectiveVolition

Comments to:

http://www.sl4.org/bin/wiki.pl?CommentaryOnCollectiveVolition

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
