[ExI] What might be enough for a friendly AI?

Stefano Vaj stefano.vaj at gmail.com
Thu Nov 18 14:38:28 UTC 2010


On 18 November 2010 00:21, spike <spike66 at att.net> wrote:
> It is even more complicated than that.  To hold this analogy, most farmers
> are truck drivers as well.  If we define a friendly AGI as one which does
> what we want, we must want what we want, and to do that we must know what we
> want.  Often, perhaps usually, this is not the case.

Very well said. Moreover, I assume most of us like to imagine
scenarios where human beings (and/or their more-or-less different
offspring) are still in a position to want different things.

> An AGI which does what we want might be called a slave, but in the wrong
> hands it is a weapon.  Hell, even in the right hands it is a weapon.

Yes, same as any computer. Or rather: same as any *machine*.

> Sure.  Time and nature will most likely slay you and me before an AGI does,
> but it isn't clear in the case of my son.

For sure, one son may kill another, a distinct possibility which,
however, not even China invokes as a ground for birth control.

We are left to discuss who qualifies as a "son", whether we should
realistically expect sons to organise themselves into factions based
on their hardware rather than on any other conceivable factor, and
what grounds may exist to prefer some sons over others...

I am not saying such issues are absurd. Only, I do not think they can
be resolved through a naive and fully implicit approach.

> An AGI that emerges later in
> history may do so under more advanced technological and ethical
> circumstances, so perhaps that one is human-safer than one which emerges
> earlier.  But perhaps not.  We could fill libraries with what we do not
> know.

What is really "human"? Why should we care about their safety?

Again, those are not rhetorical questions; I am not implying that
humans do not exist or that we should not care. But the personal
answers we give to such questions must be consistent, *and* they
determine how we deal with the AGI issue.

Or with alien visitors. Or with clones. Or with biological entities
engineered in radically different fashion. Or with other animals, for
that matter.

-- 
Stefano Vaj