[ExI] Eliezer S. Yudkowsky, Singularitarian Principles. Update?
Samantha Atkins
sjatkins at mac.com
Mon Nov 15 04:48:18 UTC 2010
On Nov 12, 2010, at 2:44 PM, Aleksei Riikonen wrote:
> On Sat, Nov 13, 2010 at 12:33 AM, BillK <pharos at gmail.com> wrote:
>>
>> As I understand SU's request, she doesn't particularly want to enter a
>> dialogue with Eliezer. Her request was for an updated version of The
>> Singularitarian Principles
>> Version 1.0.2 01/01/2000 marked 'obsolete' on Eliezer's website.
>>
>> Perhaps someone could mention this to Eliezer or point her to more
>> up-to-date writing on that subject? Doesn't sound like an
>> unreasonable request to me.
>
> If people want a new version of Singularitarian Principles to exist,
> they can write one themselves.
Hardly. I cannot speak for the Institute. How would my writing such a thing be anything but my own opinion? I want to know what SIAI's current positions are. What is its current formulation of what an FAI is and how it may be attained? What are its current definitions of Friendliness, hopefully in implementable and testable terms? What sort of AGI, recursively optimizing procedure, or whatever does it propose to create? What means does it advocate to avoid unfriendly AGI? Does it seek a singleton AGI (or equivalent) or peer AGIs, and why?
> Eliezer has no magical authority on the
> topic, that would necessitate that it should be him. (Also, I doubt
> Eliezer thinks it important for a new version to exist.)
>
An organization that claims its sole purpose is the attainment of a safe and Friendly AGI-driven singularity, or at least the avoidance of UFAI, is under no obligation to state what its current thinking and positions are? If it does not, then why would anyone take it seriously (at least with respect to those stated goals) at all?
- s