[ExI] Eliezer S. Yudkowsky, Singularitarian Principles. Update?
natasha at natasha.cc
Fri Nov 12 23:30:20 UTC 2010
This is an interesting dialogue.
I suppose what is most interesting is the way the Singularity has
been obfuscated.
Eliezer's interest is FAI, where he has developed a theoretical
approach to superintelligence; that is where his expertise lies. As I
recall, Eli was originally interested in seed AI.
As far as the Singularity goes, the early experts are Good, Vinge,
Broderick, and Kurzweil. Since AI, AGI, and FAI are variables of the
Singularity, Eli applied this framework to his theory of seed AI and
FAI.
Eli aligns with Bostrom and Hanson, which is very fortunate for him in
light of his nonacademic standing. Regardless, Eli is a delightful
speaker. I don't know the value of his work beyond its being
theoretical and stimulating.
Natasha
Quoting Aleksei Riikonen <aleksei at iki.fi>:
> On Fri, Nov 12, 2010 at 6:03 PM, Richard Loosemore
> <rpwl at lightlink.com> wrote:
>> Singularity Utopia wrote:
>>>
>>> Thanks Richard Loosemore, regarding the SL4 route to contact Eliezer,
>>> that's exactly the info I needed.
>>>
>>> John Grigg, you say I may not be allowed to stay long on the SL4 list? Why
>>> is this, are Singularitarians an intolerant group leaning towards fascism?
>>
>> Er.... you may be misunderstanding the situation. ;-)
>>
>> You will be unwelcome and not tolerated on SL4, because:
>>
>> a) The singularity is, for Eliezer, a power struggle. It is a matter of
>> which personality "owns" these ideas .... who determines the agenda, who is
>> seen as the pre-eminent power broker .... who has the largest army of
>> volunteers to spread the message. And in that situation, you, my friend,
>> are a Threat. Even if your ideas were more sensible than his you would be
>> attacked and denounced, for the simple reason that you would not be meekly
>> conforming to the standard view of the singularity (as defined by The Wise
>> One).
>
> Might as well comment on Loosemore's mudslinging for a change...
>
> Richard Loosemore is himself one of the very few people who have
> ever been kicked off SL4 (the vast majority of people who strongly
> disagree with e.g. Eliezer have of course not been kicked off), and
> ever since then he has been bad-mouthing Eliezer.
>
> Apparently Loosemore's beliefs now include e.g. that the person
> calling himself "Singularity Utopia" would be seen by Eliezer as a
> threat :) In light of such statements, I invite people to make their
> own judgements on how clearheaded Loosemore manages to be when
> commenting on Eliezer.
>
>
> To Singularity Utopia: You are free to join SL4, as everyone is
> (though that list indeed isn't used much these days). But I'm quite
> certain that joining will not result in your successfully contacting
> Eliezer, and it is *not* appropriate to join just for that reason;
> that would be abuse of the list (even though the contact attempt
> would likely fail).
>
> As Eliezer notes on his homepages that you have read, the primary way
> to contact him is email. It's just that he gets so much email,
> including from a large number of crazy people, that he of course
> doesn't answer them all. (You, unfortunately, are one of those crazy
> people who pretty surely will be ignored. So in the end, on this
> matter it would be appropriate of you to accept that -- like all
> people -- Eliezer should have the right to choose who he spends his
> time talking to, and that he most likely would not want to correspond
> with you.)
>
> --
> Aleksei Riikonen - http://www.iki.fi/aleksei
>