[ExI] Autonomous car ethics

Dave Sill sparge at gmail.com
Sat Jun 25 00:49:58 UTC 2016


On Fri, Jun 24, 2016 at 3:44 PM, BillK <pharos at gmail.com> wrote:

>
> Yes. But the researchers are making the point that people won't buy /
> use a robot car that won't take all possible steps to protect the
> passengers.
>

And the first time a self-driving car plows through a pack of pedestrians,
Democrats in Congress will stage a sit-in demanding that self-driving cars
avoid pedestrians.


> That must also be the cheapest / most profitable type of car to build
> as well. In the seconds available to analyse the situation, there is
> not enough time to make a value judgement of all possible
> consequences. Just sending a call for assistance and deciding how best
> to protect the passengers while minimising other damage is a difficult
> enough task


Driving a car is a pretty damned difficult task. The seconds it takes an
accident to unfold are not enough time for humans to make well-considered
decisions, but they're a relative eternity for a computer. Plenty of time to
choose between colliding with a vehicle going in the same direction vs. a
vehicle coming the other way, or colliding with a low wall vs. an abutment.
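
Just to illustrate the kind of comparison a computer could make in that time
(this is only a sketch with made-up numbers, not anyone's actual algorithm):
score each collision option by closing speed and how unyielding the obstacle
is, and pick the least bad one.

# Illustrative sketch only: hypothetical options and invented rigidity values.
def impact_severity(own_speed, obstacle_speed, rigidity):
    """Crude harm proxy: closing speed scaled by how unyielding the obstacle is.

    obstacle_speed is signed along our direction of travel
    (positive = moving with us, negative = oncoming, 0 = stationary).
    rigidity is 0..1 (1 = rigid abutment, lower = deformable barrier).
    """
    return abs(own_speed - obstacle_speed) * rigidity

def least_harm_option(options, own_speed):
    """Return the option with the lowest estimated severity."""
    return min(options, key=lambda o: impact_severity(own_speed, o[1], o[2]))

# The examples above, with made-up numbers, at 25 m/s:
options = [
    ("vehicle ahead, same direction", 20.0, 0.6),
    ("oncoming vehicle",             -20.0, 0.6),
    ("low wall",                       0.0, 0.4),
    ("bridge abutment",                0.0, 1.0),
]
print(least_harm_option(options, own_speed=25.0))

Picking among four options like that takes microseconds; the hard part is
perception and prediction, not the final comparison.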

My point is that self-driving cars will have to make these decisions, and
just because buyers want feature X doesn't mean buyers will get feature X.

-Dave

