<br><br><div><span class="gmail_quote">On 12/06/07, <b class="gmail_sendername">Eugen Leitl</b> <<a href="mailto:eugen@leitl.org">eugen@leitl.org</a>> wrote:<br><br></span><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
> So you would give a computer program control of a gun, knowing that it<br>> might shoot you on the basis of some unpredictable outcome of the<br>> program?<br><br>Of course you know that there are a number of systems like that, and
<br>their large-scale deployment is imminent. People don't scale, and<br>they certainly can't react quickly enough, so the logic of it<br>is straightforward.<br></blockquote></div><br>No system is completely predictable. You might press the brake pedal in your car and the car might accelerate instead, most likely due to your own error but not inconceivably due to mechanical failure. If you were to replace this manual system with an automatic one, you would want to make sure that the new system is at least as reliable, and there would be extensive testing before it was released on the market. Why would anyone forgo such caution for something far, far more dangerous than car braking?
<br><br><br clear="all"><br>-- <br>Stathis Papaioannou