[ExI] Unfriendly AI is a mistaken idea
Russell Wallace
russell.wallace at gmail.com
Sun May 27 19:17:35 UTC 2007
On 5/27/07, Eugen Leitl <eugen at leitl.org> wrote:
>
> On Sun, May 27, 2007 at 01:12:50AM +0100, Russell Wallace wrote:
> > Name: Tools are neither friendly nor unfriendly
>
> If tools are persons, yes, they are.
In reality however, for better or worse, tools are not persons, nor is there
any prospect of it being feasible to create a person from scratch.
> If a piano falls on the top of your head, you're still dead.
> And no malice intended.
In practice, though, I don't actually spend an awful lot of time worrying
about the prospect of a piano jumping on my head and killing me, let alone a
race of unfriendly pianos creating an existential disaster by jumping on
everyone's head and killing them. That's because jumping on people and
killing them is a complex behavior that doesn't happen by accident. If I were
walking through the jungle and came across a tiger, I would be wary of it
jumping on me and killing me, but that's because its ancestors have been
selected for that behavior for the last million years or so. The ancestors of
pianos haven't.
> Pollution is a killer feature of systems, produced by human engineers.
Every system, whether produced by human engineers, evolution or any other
source, must produce pollution directly or indirectly, according to the laws
of thermodynamics. There is therefore no question that an AI will produce
pollution (though happily the amount produced per kilowatt, gigaflop or
other unit of output is going down; in the Lamarckian evolutionary
environment of a man-made technosphere, pollution is maladaptive). It
doesn't follow that an AI will conquer the world. Conquering the world is a
complex behavior that doesn't happen by accident.