[ExI] Self-Driving Cars Must Make Ethical Decisions

BillK pharos at gmail.com
Wed Jul 29 09:33:55 UTC 2015

This is one for Anders!  :)



Researchers are trying to program self-driving cars to make
split-second decisions that raise real ethical questions.
By Will Knight on July 29, 2015

At a recent industry event, Gerdes gave an example of one such
scenario: a child suddenly dashing into the road, forcing the
self-driving car to choose between hitting the child or swerving into
an oncoming van.

“As we see this with human eyes, one of these obstacles has a lot more
value than the other,” Gerdes said. “What is the car’s […]”
Gerdes pointed out that it might even be ethically preferable to put
the passengers of the self-driving car at risk. “If that would avoid
the child, if it would save the child’s life, could we injure the
occupant of the vehicle? These are very tough decisions that those
that design control algorithms for automated vehicles face every day,”
he said.
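The trade-off Gerdes describes can be caricatured as a cost-minimisation problem. The sketch below is purely illustrative: every outcome, harm number, and weight is invented here, and real automated-vehicle planners are vastly more complex — but it shows how "the weights are the ethics".

```python
# Toy sketch (not any real vehicle's algorithm): pick the manoeuvre with
# the lowest weighted expected harm. All names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    harm_to_pedestrian: float  # 0.0 (no harm) .. 1.0 (fatal)
    harm_to_occupants: float
    harm_to_others: float      # e.g. occupants of the oncoming van

def expected_cost(o: Outcome, weights: dict) -> float:
    """Weighted sum of expected harms; the weights encode the ethics."""
    return (weights["pedestrian"] * o.harm_to_pedestrian
            + weights["occupants"] * o.harm_to_occupants
            + weights["others"] * o.harm_to_others)

def choose(outcomes, weights):
    """Select the outcome with minimal weighted expected harm."""
    return min(outcomes, key=lambda o: expected_cost(o, weights))

outcomes = [
    Outcome("brake hard, hit child", 0.9, 0.0, 0.0),
    Outcome("swerve into oncoming van", 0.0, 0.6, 0.6),
    Outcome("swerve off road", 0.0, 0.4, 0.0),
]
# Valuing everyone equally, risking the occupants (swerving off the
# road) minimises total harm -- exactly the possibility Gerdes raises.
weights = {"pedestrian": 1.0, "occupants": 1.0, "others": 1.0}
print(choose(outcomes, weights).description)  # swerve off road
```

Changing a single weight (say, raising "occupants" to 3.0) flips the decision to hitting the child, which is why the choice of weights, not the optimisation, is the hard ethical question.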

Walker-Smith adds that, given the number of fatal traffic accidents
that involve human error today, it could be considered unethical to
introduce self-driving technology too slowly. “The biggest ethical
question is how quickly we move. We have a technology that potentially
could save a lot of people, but is going to be imperfect and is going
to kill.”
End quotes   -------------

I think they might also need an ethics options setup screen for the
driver, since the ethics of the manufacturer might not agree with the
ethics of the driver. Should a national set of car driving ethics be
made compulsory? Will different cars have different ethics? This could
generate some interesting features to attract sales.
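One way the "options screen versus national standard" tension could be resolved is for a regulator to mandate bounds rather than a single compulsory setting, with drivers choosing within those bounds. A minimal sketch, with all names and limits invented for illustration:

```python
# Hypothetical sketch: a driver-selectable "ethics profile" checked
# against a regulator's minimum. The threshold value is invented.
from dataclasses import dataclass

@dataclass(frozen=True)
class EthicsProfile:
    name: str
    occupant_weight: float    # how strongly the car protects its occupants
    pedestrian_weight: float  # how strongly it protects people outside

# A national rule might set a floor rather than dictate one profile.
NATIONAL_MIN_PEDESTRIAN_WEIGHT = 0.5

def validate(profile: EthicsProfile) -> bool:
    """Reject profiles that undervalue people outside the car."""
    return profile.pedestrian_weight >= NATIONAL_MIN_PEDESTRIAN_WEIGHT

selfish = EthicsProfile("protect me at all costs", 1.0, 0.1)
balanced = EthicsProfile("balanced", 1.0, 1.0)
print(validate(selfish))   # False: below the national floor
print(validate(balanced))  # True
```

Under this scheme a manufacturer could still differentiate on ethics settings within the permitted range, which preserves the "features to attract sales" possibility.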

There is also a chance that hackers will be able to change the ethics
section of the car's computer: anywhere from 'Protect me at all costs'
to weaponizing it with 'Kill as many as possible'.
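One standard defence against this kind of tampering is to sign the ethics configuration and refuse to operate if the signature does not verify. A minimal sketch using Python's standard `hmac` module; key handling is deliberately simplified (a real vehicle would keep the key in secure hardware):

```python
# Hypothetical sketch: detect tampering with an ethics config by
# verifying an HMAC signature before the car will use it.
import hashlib
import hmac

SECRET_KEY = b"manufacturer-key"  # in practice: secure hardware, not source code

def sign(config: bytes) -> bytes:
    """Produce an HMAC-SHA256 tag for a configuration blob."""
    return hmac.new(SECRET_KEY, config, hashlib.sha256).digest()

def verify(config: bytes, signature: bytes) -> bool:
    """Constant-time check that the config matches its signature."""
    return hmac.compare_digest(sign(config), signature)

config = b"pedestrian_weight=1.0;occupant_weight=1.0"
sig = sign(config)
print(verify(config, sig))                                        # True
print(verify(b"pedestrian_weight=0.0;occupant_weight=1.0", sig))  # False
```

This only detects modification of the stored settings, of course; an attacker who extracts the key or compromises the computer that does the checking defeats it, so it narrows the attack surface rather than eliminating it.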

