[extropy-chat] Engineered Religion
Eliezer S. Yudkowsky
sentience at pobox.com
Sun Mar 20 02:30:15 UTC 2005
john-c-wright at sff.net wrote:
>
> The nanotechnology and superintelligent Jupiter-brains might also escape the
> control of their creators. Indeed, the whole transhumanist effort seems to be
> based on the idea that, as the Singularity approaches, it will slip from human
> control into the hands of a child-race of ours, astronomically smarter than man.
Different transhumanists have different ideas about this. Certainly
that was my plan, once upon a time, when superintelligence was a
separate and mysterious magisterium to me. Surely I do not believe that
humans will still walk the Earth a million years hence - one way or
another. But it is no longer my plan to personally and deliberately
carry out this transition, at least not directly. Speaking on behalf of
the human species, we are not ready to be a parent.
Astronomically smarter than human doesn't take a terribly huge amount of
improvement. I no longer think that artificial superintelligence needs
to be inscrutable.
> Like all good parents, we must instruct our children in the basic rules of
> morality, lest they become monsters and turn on us. My question then becomes:
> what religion do we teach the intelligent machines in the early days, before
> they are independent? Do we want them all to be atheists, impatient and
> uncomprehending of the spiritual life of man?
Yes, no, and no.
I'm not just talking about the need to build artificial
superintelligences that conform to the laws of probability theory. No
child of mine will ever cower before an imaginary God. It is beneath
the dignity of human beings and it is beneath the dignity of our
descendants. If the lightning is beautiful, then let us see the beauty
in electricity without need for thunder deities; for if we cannot learn
to take joy in the merely real, our lives will be empty indeed.
But I'm not going to try to hardcode that, not in a child nor in an AI.
As an atheist, I have a simple, matter-of-fact confidence that
religionists once had and relinquished long ago. I don't think I need
to load the dice for my answer to win. All I need is to set in motion
the dynamics that seek truth, i.e., some computable approximation of
Solomonoff induction. If there were the tiniest shred of truth to
religion, that would be enough to uncover it. If you have even a
droplet of honest belief left, not just empty excuses for a faith you
lost long ago, you will not ask me to load an AI's dice in favor of your
pet theory. The truth will out.
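The argument above can be illustrated with a toy sketch (my own construction, not anything from this post): give every hypothesis a prior penalized by its description length, a crude nod to Solomonoff induction, and let the evidence do the rest. No hypothesis needs its dice loaded; if one were true, the data would uncover it. The hypothesis names and the complexity figures here are invented for illustration.

```python
# Toy Solomonoff-flavored learner: prior weight 2**-K, where K is a
# stand-in for description length, then Bayes' rule updates on evidence.
hypotheses = {
    # name: (description-length proxy K, probability the coin lands heads)
    "fair coin": (1, 0.5),
    "biased 0.8": (3, 0.8),
    "biased 0.2": (3, 0.2),
}

def posterior(data):
    """Bayes' rule over the hypothesis set; data is a string of H/T flips."""
    weights = {}
    for name, (k, p) in hypotheses.items():
        prior = 2.0 ** -k          # complexity-penalized prior
        likelihood = 1.0
        for flip in data:
            likelihood *= p if flip == "H" else 1.0 - p
        weights[name] = prior * likelihood
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

print(posterior("HHHHHHHHTH"))  # posterior mass shifts toward the 0.8-biased coin
```

Despite starting with a quarter of the fair coin's prior weight, the 0.8-biased hypothesis dominates after ten flips: the dice were never loaded in its favor, yet the evidence carried it.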
> We could make them open-minded agnostics, not believing in anything in
> particular, but this might make them prey to fads and lunacies. (No offense
> meant to respected agnostic brethren, but it is a state of mind where the
> wondering of man finds no rest. Atheists, at least, are certain.)
Before I go around creating a child, I think I shall take a stab at
plain vanilla Bayesian superintelligence. I am not sure I would take
quite the same plain vanilla approach to creating a child, but then I'm
not ready to be a father. There is a proper order to the mastery of
adult arts. Before the creation of a child comes the casting of simpler
sorceries.
> My suggestion, of course, is to school them in a religion that preaches and
> practices charity to the poor, the kindness to the infirm and chivalry to the
> weak. That way, once they become our superiors, they will have a better nature
> to which to appeal.
Your well-meant suggestion is refused. When you yourself - someday, if
you live and we survive - acquire the knowledge necessary to create a
child of the human species, or even a Bayesian, you will understand.
You have not the knowledge, this day, to relate actions to their
consequences. Who are you to design another mind, when your own
thoughts remain mysteries to you?
--
Eliezer S. Yudkowsky http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence