[ExI] A Vindication of the Rights of Machines

ablainey at aol.com ablainey at aol.com
Thu Feb 14 19:13:22 UTC 2013




-----Original Message-----
From: Anders Sandberg <anders at aleph.se>
Sent: Thu, 14 Feb 2013 18:22

> You can also have analogies. In my upcoming paper on upload ethics I argue that emulations of animals should be treated as if they had the same moral standing as the animal unless we can prove that the emulation lacks the relevant properties for having moral patienthood. But this is because they are analogous to the original. If the AI is something unique we have a harder time figuring out its moral status.
Similarly, what of the moral status of an incomplete or deficiently uploaded human: do we afford them equal rights to the original? Personally I am tempted to put them in the same unknown moral category as AI.
> Legal persons are however not moral persons. Nobody says that it is wrong for the government to dissolve or split a company, despite the misgivings we have about capital punishment. Same thing for legal rights: ideally they should track moral rights, but it is a bit random.

>> It wouldn't be the same as giving it human rights as companies still don't have such rights, but in many places they can vote.

> Where else but the City of London?
I was under the impression that corporations had limited voting rights in some places in the US? Perhaps a misremembering or misinterpretation on my part. In any case, it's interesting that a right as dearly held as voting can be given to a non-living thing while that thing lacks other basic human rights.
> This is a real problem. If there is nothing like punishment, there might not be any real moral learning. You can have a learning machine that gets negative reinforcement and *behaves* right due to this, but it is just like a trained animal. The interesting thing is that the negative reinforcement doesn't have to be a punishment by our standards, just an error signal.
Perhaps. I personally have "Brain Hurt" with this area. I can only equate it to the issue of needing to replicate human chemistry in uploads, or all our error signals such as pain, remorse and jealousy will be lost. I can't help but think a simple error signal to a machine is as meaningless as showing a red card to a sportsman who doesn't know what it means. It is only a symbol; the actual punishment comes from the chemistry it evokes. If we give a machine a mere symbol of pain, that really won't cut it, imho.
In the same way, if I were immortal and jumped off a building, seeing you hand me a piece of card that says "broken legs" isn't going to stop me walking away. LOL. Even if I do know the rules of the game.
It's a tough one. I think it would depend on what the currency of the machine is (in Dr Phil speak), i.e. what the machine holds as valuable, and then using that as leverage for good behavior.
"Bad machine!.....no data for you!"

> Moral proxies can also misbehave: I tell my device to do A, but it does B. This can be because I failed at programming it properly, but it can also be because I did not foresee the consequences of my instructions. Or the interaction between the instructions and the environment. My responsibility is 1) due to how much causal control I have over the consequences, and 2) how much I allow consequences to ensue outside my causal control.
A problem that already exists. I have wondered about the implications of the automatic parking available in some cars. Should you engage this parking system and your car *decides* to prang the parked vehicle in front, who is responsible? The car for bad judgement, you for not applying the brakes, the engineer who designed the physical mechanisms, the software developer, or the salesman who told you it was infallible?
I think as such autonomous systems evolve there should, and hopefully will, be a corresponding evolution of law brought about by liability suits. I'm not aware of any yet, but I'm 100% sure they will appear if they haven't already. Perhaps the stakes are not yet high enough with simple parking mishaps, but when the first self-driving car ploughs through a bus stop full of nuns, the lawyers will no doubt wrestle it out for us.
