[ExI] keynes vs hayek again, was: RE: 3d printers for sale
Charlie Stross
charlie.stross at gmail.com
Tue Aug 28 11:27:50 UTC 2012
On 28 Aug 2012, at 04:06, Anders Sandberg <anders at aleph.se> wrote:
>
> On 28/08/2012 07:18, spike wrote:
>> Car-drones are a big threat. It might be hard to combat such a weapon.
>
> Current society is very vulnerable to autonomous car-bombs (or my earlier sketch of terrorist drones). That doesn't mean it will remain vulnerable if they start to occur. It is a bit like computer security: it was nonexistent until needed. But continuing that analogy, if the incentives or popular solutions turn out to be bad, then we might get a suboptimal situation, like in computer security (in fact, bad computer security together with robot-like devices makes all such devices potentially malicious).
>
> What would a *good* solution to the car-bomb problem be?
As with all security threats, a *good* solution is to pre-empt the incentives that generate the threat in the first place. Lone-wolf individuals with a grudge, such as Timothy McVeigh -- and the odd psychotic genius, such as Anders Breivik -- are perhaps inevitable in any complex society, but doing serious damage to a society takes more than one person: it takes an organized group with a recruiting base, and such groups usually nucleate around causes where a much larger population perceives a huge injustice.
A good start would be to work on identifying such potential flashpoints and defusing them before they generate radicals who are willing to take up arms. But that's a political problem, and possibly a structural problem for any government that attempts to administer a diverse population. Worse, trying to solve it requires compromise with radical minority political viewpoints, which in turn may be seen as weakness or immorality or whatever by other groups (including the political party or coalition running the government).
And that still doesn't buy us a solution to state-level actors engaging in this kind of strategy.
It's one reason why I consider drone strikes to be a reprehensible and very dangerous form of warfare. They're basically assassination weapons. And assassination is *cheap*. It lowers the threshold of violence and lets poor governments compete on a level playing field with world powers, which in turn generates more flashpoints and ultimately causes chaos. (Never mind that the current US drone policy in Afghanistan includes attacking people who go out to retrieve the bodies of those killed in earlier strikes -- arguably a war crime -- and attacking funerals and other public gatherings. In a part of the world where the blood feud is a common social pattern! Which means those missile strikes are a potent recruiting tool for the Taliban. Madness, right?)
> Transparency is only good enough if it prevents bad things before they happen, so we need very proactive monitoring.
> Since a car is already a deadly device (just get it to drive fast and crash) it is not clear that transparency could see a hack attack coming: for transparency to fully work we need to have solved the computer security problem. Something like a capability approach might be more promising: there are built-in and *hardwired* safety rules (like "don't hit pedestrians", or "if loaded with more than X kg of something, running with no driver, and approaching a federal building, allow a surveillance sweep in a separate garage"), and to participate in the traffic system a car has to prove cryptographically that it follows them ("trusted commuting"). Loads of implementation and introduction problems, of course.
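For concreteness, the attestation handshake that "trusted commuting" implies would look something like the sketch below. Every name in it is made up, and a real deployment would presumably use public-key remote attestation backed by tamper-resistant hardware rather than this toy shared-key check; the point is only the shape of the protocol: the car proves it is running firmware the road authority has audited as containing the hardwired rules, and only then is admitted to traffic.

# Hypothetical sketch only: a challenge-response check in which a vehicle
# proves it runs audited "hardwired safety rules" firmware before being
# admitted to the traffic system. Names and the HMAC scheme are assumptions.
import hashlib
import hmac
import os

# Firmware builds the road authority has audited as enforcing the rules.
APPROVED_FIRMWARE_HASHES = {hashlib.sha256(b"safety-firmware-v1").hexdigest()}


class VehicleModule:
    """Stand-in for a tamper-resistant module holding the device key."""

    def __init__(self, firmware: bytes, device_key: bytes):
        self.firmware = firmware
        self.device_key = device_key  # never leaves the module in a real design

    def attest(self, challenge: bytes) -> dict:
        fw_hash = hashlib.sha256(self.firmware).hexdigest()
        proof = hmac.new(self.device_key, challenge + fw_hash.encode(),
                         hashlib.sha256).hexdigest()
        return {"firmware_hash": fw_hash, "proof": proof}


class RoadAuthority:
    def __init__(self, registered_keys: dict):
        self.registered_keys = registered_keys  # vehicle id -> device key

    def admit(self, vehicle_id: str, module: VehicleModule) -> bool:
        challenge = os.urandom(32)
        report = module.attest(challenge)
        expected = hmac.new(self.registered_keys[vehicle_id],
                            challenge + report["firmware_hash"].encode(),
                            hashlib.sha256).hexdigest()
        genuine = hmac.compare_digest(expected, report["proof"])
        return genuine and report["firmware_hash"] in APPROVED_FIRMWARE_HASHES


key = os.urandom(32)
car = VehicleModule(b"safety-firmware-v1", key)
authority = RoadAuthority({"XYZZY": key})
print(authority.admit("XYZZY", car))  # True only for audited firmware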
No.
More like: automated ubiquitous vehicle sensor networks in all roads, so that all operational vehicles, manned or automated, are identified and tracked at all times. And a mandatory fail-safe, police-despatch-controlled cut-off for automated vehicles: as in, UNLESS an AV has permission to operate, it WILL shut down immediately and be flagged up in a control room. Yes, normally all AVs will have permission to operate -- but it needs to be a system whereby AVs can only operate by permit, and any attempt to disable or spoof the automated remote cut-off needs to generate an urgent police call. ("Someone is feloniously trying to turn vehicle XYZZY into a potential getaway car or car bomb. Investigate immediately: caution advised.")
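A minimal sketch of that fail-safe logic, assuming a purely hypothetical despatch back end (all class and method names below are illustrative, not any real system): the default is always "stop and flag", including when the permit check itself fails.

# Hypothetical sketch of a permit-based fail-safe cut-off. The despatch
# service, permit format and alerting hook are assumptions; the essential
# property is fail-safe behaviour: no valid permit, or any error while
# checking, means the vehicle stops and the control room is notified.
import time
from dataclasses import dataclass
from typing import Dict, List, Optional


@dataclass
class Permit:
    vehicle_id: str
    expires_at: float  # Unix timestamp


class DespatchService:
    """Stand-in for the police-despatch / control-room back end."""

    def __init__(self) -> None:
        self.permits: Dict[str, Permit] = {}
        self.alerts: List[str] = []

    def issue(self, vehicle_id: str, ttl_s: float) -> None:
        self.permits[vehicle_id] = Permit(vehicle_id, time.time() + ttl_s)

    def current_permit(self, vehicle_id: str) -> Optional[Permit]:
        return self.permits.get(vehicle_id)

    def raise_alert(self, message: str) -> None:
        self.alerts.append(message)  # in reality: flag a control room, call police


class AutonomousVehicle:
    def __init__(self, vehicle_id: str, despatch: DespatchService):
        self.vehicle_id = vehicle_id
        self.despatch = despatch
        self.running = False

    def control_tick(self) -> None:
        """One supervisory-loop iteration: fail safe, never fail open."""
        try:
            permit = self.despatch.current_permit(self.vehicle_id)
            allowed = permit is not None and permit.expires_at > time.time()
        except Exception:
            allowed = False  # inability to verify counts as "no permission"
        if allowed:
            self.running = True
        else:
            self.running = False  # immediate shutdown
            self.despatch.raise_alert(
                "Vehicle %s has no valid operating permit" % self.vehicle_id)


despatch = DespatchService()
av = AutonomousVehicle("XYZZY", despatch)
despatch.issue("XYZZY", ttl_s=0.1)
av.control_tick()
print(av.running)                    # True while the permit is valid
time.sleep(0.2)
av.control_tick()
print(av.running, despatch.alerts)   # False, and the control room is flagged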
And even that isn't going to prevent the Provisional IRA's 1970s intelligent car-bomb algorithm (which I believe is currently being used in Syria):
1. Find someone who works for the British government (e.g. a builder who works on police stations, or a fast-food vendor who sells to soldiers).
2. Take their family hostage at gunpoint.
3. Put them in their own car. Tell them to drive to a police station. Tell them they're being watched, and that if they don't go where they're told, they'll come home to find their family in pieces.
-- Charlie