[ExI] AGI Motivation revisited [WAS Re: Isn't Bostrom seriously ...]
stefano.vaj at gmail.com
Thu Jun 23 13:35:16 UTC 2011
On 22 June 2011 21:11, Richard Loosemore <rpwl at lightlink.com> wrote:
> A "motivation mechanism" is something that an ordinary PC does not even
> have, so I cannot for the life of me make sense of your first sentence.
Damien is right in suggesting that this can be considered a reductio ad
absurdum, but my point is that either the entity we develop really emulates
animal behaviour, or the very meaning of the word "motivation" needs to be
enlarged to radically metaphorical projections of our own internal states,
which end up being equally applicable to any universal computing device.
Or perhaps even to any natural phenomenon...
As to the Principle of Computational Difference, the sense in which it may
be relevant to this discussion is the following: all universal computing
devices, biological ones included, are one and the same in terms of what they
can in principle do. Speaking of their relative "intelligence", one vis-a-vis
another, has only two rigorous senses:
- the first is their relative performance in a given task (so that in this
sense a computer can be arbitrarily "intelligent" without exhibiting any
anthropomorphic traits);
- the second is their possibly executing an "animal", that is, Darwinian,
program (so that in this sense a "friendly-by-definition" device, whatever
this may mean, would not be recognised as intelligent, nor would it ever pass a
Turing test).
>> As to Turing-passing beings, that is beings which can be performant or not
>> in the task but can behaviourally emulate specific or generic human beings,
>> you may have a point that either they do it, and as a consequence cannot be
>> either better or worse than what they emulate, or they do not (and in that
>> event will not be recognisable as "intelligent" in any anthropomorphic
>> sense).
>> As to empathy to the "human race" (!), I personally do not really feel
>> anything like that, but I do not consider myself more psychotic than
>> average, so I am not inclined to consider seriously any such rhetoric.
> Rhetoric? It is not rhetoric. If you are not psychotic (and I have no
> reason to believe that you are), then you already have some empathy for your
> species, whether you are introspectively aware of it or not.
Please believe that I do not, not any more than you can automatically have
for, say, your race as such, whatever it may be.
I have empathy for actual beings, which for that matter may or may not
belong to my family, race or species.
>> Sure, you may well hard-code in a computer behaviours aimed at protecting
>> such a dubious entity, and if this works to operate the power grid you will
>> end up without electricity the first time you have to perform an abortion.
>> Do we really need that?
> What?! I am sorry, but you will have to clarify your train of thought for
> me, because I can make no sense of this.
"Friendliness for the Man" is a cultural construct which can easily be
analysed as a hypostasis of judeo-christian ethical concepts which do not
bear closer inspection when they are reduced and secularised to "scientific"
concept such as the species.
A foetus, in my example, certainly belongs to the species, and thus a
friendly AGI operating the grid should refuse electricity to any operating
room where an abortion were to be performed.
But more to the point, the paradoxes of such concepts when applied to AGIs
are illustrated, inter alia, in the fictionalisation offered by Jack
Williamson's Humanoids cycle, where the only consistent behaviour for
machines acting in strict compliance with Asimov's Laws is to strip human
beings of anything "human" they may have.
Now, I do not really see why we should bother creating lobotomised
children of the mind, at the risk of having them lobotomise us. If this
is what is suggested, I would simply drop the effort of developing
anthropomorphic behaviours on silicon, and be contented with increasingly
powerful computers and "ordinary", biological, albeit perhaps genetically
enhanced, children.