[extropy-chat] SIAI seeking seed AI programmer candidates

Michael Anissimov michael at acceleratingfuture.com
Wed Jun 2 03:29:30 UTC 2004


Adrian Tymes wrote:

>Perhaps.  Setting humility aside for the moment, most
>of my past employers told me I was far and away the
>best programmer they had ever worked with.  The ones
>who didn't were employing me for non-programming
>skills.  I've gotten over $100K salary at previous
>jobs, and even with personal feelings towards this
>project letting me discount my rate, I'd demand a high
>pay rate so I can set aside all material worries
>(first and foremost, what happens to me - and my other
>financial concerns - if the project goes nowhere
>

If no FAI project ever goes anywhere, then someone eventually builds a 
self-improving UFAI (or engages in nanowar), and you, your financial 
concerns, and your personal concerns all go *poof* in one fell swoop. 

>...but this answer simply does NOT suffice.  I suspect
>the same is true for any programmer of my caliber or
>higher, which only reinforces things.  (Creating
>Friendly AI by myself?  Unlikely, at best.  Creating
>Friendly AI with a lot of help of my caliber?  That's
>starting to become possible.)
>

That's why we're seeking *Singularitarian* 
(http://yudkowsky.net/sing/principles.html) programmers of extremely 
high caliber.  They would be in it for the saving-the-world aspect.  A 
successful Singularity would bring about an immense amount of material 
abundance.

>Transforming the Earth the right way, even if one dies
>in the process, but is at least certain the project
>will reach its desired end - that can be managed.
>Risking everything when there's a good chance of
>nothing?  Sorry, but the same judgement you call for
>to guide the project through, tells me there are far
>less risky (personally and for all of humanity) paths
>to reach the same end of a Friendly AI, and they don't
>(presently) involve me dedicating my working hours to
>non-payers like you.
>

Where *do* they involve dedicating your hours?  Are there faster tracks 
to Friendly AI than the one SIAI is currently on?  SIAI was formed with 
the specific goal of getting to FAI as fast as possible, so it is our 
responsibility to seek out faster means if they are available.  What are 
your ideas?  Incremental bootstrapping with commercial approaches sounds 
more credible on the surface, but it isn't likely to get us to FAI 
faster than an exclusive focus, if that's what you were thinking of.

-- 
Michael Anissimov                           http://www.singinst.org/
Advocacy Director, Singularity Institute for Artificial Intelligence

--
Subscribe to our free eBulletin for research and community news:
http://www.singinst.org/news/subscribe.html
