[ExI] Autonomous car ethics

Chris Hibbert hibbert at mydruthers.com
Sat Jun 25 17:16:45 UTC 2016


 > Should a self-driving car kill its passengers for the greater good?
 > For instance, by swerving into a wall to avoid hitting a large number
 > of pedestrians?

This is purely an academic philosophy question, not one that needs to 
be resolved before self-driving cars are put on the road.

I'd be willing to bet that there isn't a single incident in the 
training data Google's cars have collected over millions of miles of 
driving that shows a driver forced to choose between injuring or 
killing the occupants and injuring or killing someone outside the 
car. Add in all the miles driven by everyone on this list, and I'd 
claim it's very unlikely that any of *you* has encountered such a 
situation either. If you drive at a sane speed on roads that are 
navigable and have decent traction, you can slow down early enough 
that the situation never arises.
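
A rough sanity check, with round numbers of my own rather than 
anything from Google's actual logs: U.S. fatal crashes happen on the 
order of once per hundred million vehicle-miles, and a genuine forced 
choice between occupants and bystanders would be at most a small 
fraction of those. Even granting one such dilemma per hundred million 
miles, a two-million-mile fleet log would be expected to contain 
about 2,000,000 / 100,000,000 = 0.02 of them, i.e., almost certainly 
none.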

I won't claim that such situations have *never* happened, but they're 
so rare, and human reaction times so slow, that there is no record of 
human drivers handling them well to compare hypothetical autonomous 
cars against.

Chris
-- 
Every machine that's put into a factory displaces labour. [...]  The
man who's put to work [on] the machine isn't any better off than he
was before; the three men that are thrown out of a job are very much
worse off.  But the cure isn't Socialism, [it's] for somebody to buckle
to and make a job for the three men.    Nevil Shute,  _Ruined City_

Chris Hibbert
hibbert at mydruthers.com
http://mydruthers.com


