[ExI] Do robot cars need ethics as well?
atymes at gmail.com
Thu Nov 29 18:26:21 UTC 2012
> Your car is speeding along a bridge at fifty miles per hour when an
> errant school bus carrying forty innocent children crosses its path.
> Should your car swerve, possibly risking the life of its owner (you),
> in order to save the children, or keep going, putting all forty kids
> at risk?
The correct decision, and the one taught by any state-licensed
driver's ed course in the US, is to maintain enough awareness and
reaction distance so that this never happens in the first place.
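That "enough reaction distance" rule can be made concrete with textbook stopping-distance arithmetic. A minimal sketch, at the fifty miles per hour of the example; the 1.5-second reaction time and 0.7 friction coefficient are illustrative assumptions, not figures from the post:

```python
# Stopping distance = reaction distance + braking distance.
# Assumed illustrative values: 1.5 s reaction time, 0.7 tire-road
# friction coefficient (roughly dry pavement).
MPH_TO_MPS = 0.44704     # miles per hour -> meters per second
G = 9.81                 # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, reaction_s=1.5, friction=0.7):
    v = speed_mph * MPH_TO_MPS
    reaction = v * reaction_s                # distance covered before braking starts
    braking = v ** 2 / (2 * friction * G)    # from v^2 = 2*a*d with a = friction * G
    return reaction + braking

print(round(stopping_distance_m(50), 1))  # about 70 m at the speed in the example
```

So under these assumptions a car at fifty miles per hour needs roughly seventy meters of clear road to stop, which is the distance the software would have to keep clear for the scenario never to arise.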
> Almost any easy solution that one might imagine leads to
> some variation or another on the Sorcerer’s Apprentice, a genie that’s
> given us what we’ve asked for, rather than what we truly desire.
Yep, reality is complex. But one can encode the same rules that
most humans live by, and that should be safe enough.
On Thu, Nov 29, 2012 at 6:31 AM, BillK <pharos at gmail.com> wrote:
> Consider the bus crash example. What if it is a prison bus with
> convicted murderers in it? Does each human carry a value tag for the
> computer to use?
What if it's a school bus rented by a prison authority, or a prison
bus rented by a school authority (for a "Scare Them Straight"
program - whatever one might think of those), or if those murderers
had all been exonerated and were being bussed to an airport so
they could go home? Identification far beyond "a person" does not
seem reliable enough to base any judgments on.