[ExI] Eliezer S. Yudkowsky, Singularitarian Principles. Update?

John Grigg possiblepaths2050 at gmail.com
Sat Nov 13 22:10:42 UTC 2010


Aleksei wrote:
http://singinst.org/grants/challenge

The goal of $200k was fully reached, and as far as I am aware, Peter
Thiel wasn't involved. (Though I can't rule out him being involved
with a moderate amount in this as well.)
>>>

I am sort of impressed by their list of projects.  But it looks like
the real goal is not so much AI research as building up an
organization that hosts conferences and markets itself better to
academia and the general public.  In that sense, Eliezer seems to be
doing very well. lol  And I noticed he did "Friendly AI research" with
a grad student, not a fully credentialed academic or researcher.

From the SI website:
Recent Achievements
We have put together a document to inform supporters on our 2009
achievements. The bullet point version:

Singularity Summit 2009, which received extensive media coverage and
positive reviews.

The hiring of new employees: President Michael Vassar, Research
Fellows Anna Salamon and Steve Rayhawk, Media Director Michael
Anissimov, and Chief Compliance Officer Amy Willey.

Founding of the Visiting Fellows Program, which hosted 14 researchers
during the Summer and is continuing to host Visiting Fellows on a
rolling basis, including graduate students and degree-holders from
Stanford, Yale, Harvard, Cambridge, and Carnegie Mellon.

Nine presentations and papers given by SIAI researchers across four
conferences, including the European Conference on Computing and
Philosophy, the Asia-Pacific Conference on Computing and Philosophy, a
Santa Fe Institute conference on forecasting, and the Singularity
Summit.

The founding of the Less Wrong web community, to "systematically
improve on the art, craft, and science of human rationality" and
provide a discussion forum for topics important to our mission. Some
of the decision theory ideas generated by participants in this
community are being written up for academic publication in 2010.

Research Fellow Eliezer Yudkowsky finished his posting sequences at
Less Wrong. Yudkowsky used the blogging format to write the
substantive content of a book on rationality and to communicate to
non-experts the kinds of concepts needed to think about intelligence
as a natural process. Yudkowsky is now converting his blog sequences
into the planned rationality book, which he hopes will help attract
and inspire talented new allies in the effort to reduce risk.

Throughout the Summer, Eliezer Yudkowsky engaged in Friendly AI
research with Marcello Herreshoff, a Stanford mathematics student who
previously spent his gap year as a Research Associate for the
Singularity Institute.

In December, a subset of SIAI researchers and volunteers finished
improving The Uncertain Future web application to officially announce
it as a beta version. The Uncertain Future represents a kind of
futurism that has yet to be applied to Artificial Intelligence —
futurism with heavy-tailed, high-dimensional probability
distributions.
>>>
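
For anyone wondering what that last phrase means in practice, here is a
rough, hypothetical sketch in Python (my own made-up numbers, not
anything taken from their actual application) of forecasting with
heavy-tailed distributions: each uncertain input is a lognormal random
variable instead of a point estimate, and a Monte Carlo run turns them
into a distribution over an arrival time.

    # Illustrative sketch only; all parameter values are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000  # number of Monte Carlo samples

    # Hypothetical heavy-tailed inputs: compute required for AI (in
    # arbitrary units) and the yearly growth factor of available compute.
    required_compute = rng.lognormal(mean=10.0, sigma=2.0, size=n)
    growth_per_year = 1.0 + rng.lognormal(mean=-1.5, sigma=0.5, size=n)

    # Years until available compute (starting from 1 unit) reaches the
    # requirement, i.e. growth_per_year ** years >= required_compute.
    years = np.log(required_compute) / np.log(growth_per_year)

    print("median years:", np.median(years))
    print("90th percentile:", np.percentile(years, 90))

The point is that the answer comes out as a fat-tailed spread of
scenarios rather than a single date.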



On 11/13/10, Aleksei Riikonen <aleksei at iki.fi> wrote:
> On Sat, Nov 13, 2010 at 3:16 PM, John Grigg <possiblepaths2050 at gmail.com>
> wrote:
>>
>> I realize he found a wealthy patron with Peter Thiel, and so money has
>> been given to the Singularity Institute to keep it afloat.  They have
>> had some nice looking conferences (I have never attended one), but I
>> am still not sure to what extent Thiel has donated money to SI or for
>> how long he will continue to do it.  I'd like to think that it's
>> enough money that Eliezer and Michael Anissimov can live comfortably.
>
> SIAI is not dependent on Peter Thiel for money (though it's very nice
> he has been a major contributor). For example, here is the page for
> the last fundraising sprint:
>
> http://singinst.org/grants/challenge
>
> The goal of $200k was fully reached, and as far as I am aware, Peter
> Thiel wasn't involved. (Though I can't rule out him being involved
> with a moderate amount in this as well.)
>
> --
> Aleksei Riikonen - http://www.iki.fi/aleksei
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>



