<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Thu, Jun 26, 2014 at 8:49 PM, Rafal Smigrodzki <span dir="ltr"><<a href="mailto:rafal.smigrodzki@gmail.com" target="_blank">rafal.smigrodzki@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">### The point discussed here is protection against dumb people. They won't be able to disable the self-destructs, and of course, before the software considers self-destruction, the previously mentioned defensive robots would have to be defeated, again, not easy for dumb, disorganized people.</div>
</div></div></blockquote><div><br></div><div>It is uneconomic to deploy defensive robots everywhere. If people trick you into thinking that you have to, they can defeat you just by causing you to overspend on defense while they spend instead on advancement. That's basically how the USA beat the USSR - at least, that was the capstone.<br>
<br></div><div>If you don't fall for that, then by definition you've left weaknesses in your defenses - enough that even low-skilled labor (and don't confuse low-skilled with disorganized: labor unions are all about organizing these sorts of folks, and have long experience battling those who wish to rule by owning the capital) could slip through and trigger the self-destructs.<br>
</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">
<div></div><div>Dealing with smart opponents is of course much more difficult, but then smart people are a minority, and may be co-opted.</div></div></div></div></blockquote><div><br></div><div>Not if they come to believe that their odds are better opposing you. Like, say, if you relegate them to a low-to-middle management position for life, never giving them a chance to rise, and they're smart enough to see through your promises to the contrary.<br>
</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">### Will they DIY enough robots to overcome my robots? Remember, they have no jobs to pay for the 3D printer feed.
</div></div></div></blockquote><div><br></div><div>So they steal it. Or they find means other than 3D printing to make something good enough (or better: 3D printing is a general-purpose production technique, and older techniques can outperform it in some cases, particularly mass manufacture of identical components such as lots of the same weapon), and steal what they need for that.<br>
</div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><p dir="ltr">> Technology never "wants" anything. It is ever but a tool.</p>
<div>### Here I strongly disagree. Technology is applied physics. It is discovered, not just made. Our desires interact with it but don't fully control it. Even simple tools don't always do what you want, and the whole space of technological possibilities constrains the shape of possible societies. The existence of a certain technological possibility has an impact on society whether the majority wants it or not.</div>
</div></div></div></blockquote><div><br></div><div>Only insofar as they are used to a certain end.<br></div><div> </div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">
<div></div><div>If there is an easy, non-preventable method of making planetary-destruction weapons in your basement, waiting to be discovered soon, then planets with even an exceedingly small number of insane, evil people would have a very short expected lifetime. In that universe, technology does not want people to be around.</div>
</div></div></div></blockquote><div><br></div><div>Anything with enough energy to destroy a planet is by definition not typical-basement-grade. But also note: despite the theoretical possibility of a humanity-eradicating plague that could have been unleashed over the past few decades - or, heck, global thermonuclear war and the ensuing nuclear winter - humanity is still around. Those with the smarts to use advanced weaponry correlate very strongly with those with the smarts not to actually use it in real situations, no matter what their bosses may demand.<br>
<br></div><div>The British protected their nuclear weapons with bicycle locks through at least the late 1990s - and they were never stolen or improperly used in all that time: <a href="http://www.bbc.co.uk/pressoffice/pressreleases/stories/2007/11_november/15/newsnight.shtml">http://www.bbc.co.uk/pressoffice/pressreleases/stories/2007/11_november/15/newsnight.shtml</a><br>
<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class="gmail_extra"><div class="gmail_quote"><p dir="ltr">> Besides, programming is not that hard a skill to pick up. Even now, some - not as many as could be, but some - out of work mid-career janitors and servers are retraining. Even in Freedonia.</p>
<div>### Only in "Superman III".</div></div></div></div></blockquote><div><br></div><div>And IRL. I happen to teach programming, from time to time. It has never been my experience that the basics are extremely hard for properly motivated people to learn. Granted, I have selected my students - those known to me quite well, and who I believe recognize they have a need to learn or that they would strongly benefit from learning - but "need to learn in order to defeat the invading robot army" would certainly meet the required level of motivation.<br>
</div></div></div></div>