[ExI] Eliezer S. Yudkowsky, Singularitarian Principles. Update?

Tomaz Kristan protokol2020 at gmail.com
Sat Nov 13 19:30:34 UTC 2010


I have never met Yudkowsky in person, but I had an internet chat with him
lasting about 7 hours, back in 2000 or 2001, I can't say exactly. It was an
interesting debate, although nothing groundbreaking. I asked him what was
going on with the seed AI. He said there was no seed AI yet. I asked him
about at least a seed for a seed AI. He said that would be the same thing,
so obviously nothing was working yet. I claimed that we could *evolve*
everything we want, intelligence included if need be. He said that would be
catastrophic. I said not necessarily; it depends on what you want evolved.
An automated car factory could be evolved, given enough computing power. He
said it would be prohibitively expensive in CPU time to evolve the right
place for every atom. I said it needn't be that precise for the majority of
atoms. He said that was an example of wishful thinking.

Later in the talk I mentioned that Drexler's molecular bearings are no more
than a concept. He insisted that Professor Drexler surely knew what he was
talking about. And so on, for 7 hours.

Since then, I have had some short encounters with him, and he was not even
that pleasant anymore. At best he tried to patronize me, but I am used to
this attitude from many transhumanists and don't care much.

I had expected that SIAI would come up with some AI design over these past
years, but they haven't, and I don't think they ever will.

He is like many others from this circle: eloquent enough and very bright,
but a zero factor in practice. Non-players, really.
