[ExI] The Coming War on General Computation

Tomasz Rola rtomek at ceti.pl
Sat Jan 7 04:47:09 UTC 2012


On Wed, 4 Jan 2012, Ben Zaiboc wrote:

> Transcript of Cory Doctorow's talk, 'The Coming War on General Computation':
> http://joshuawise.com/28c3-transcript
> 
> Video can be found at 
> http://boingboing.net/2011/12/27/the-coming-war-on-general-purp.html
> 
> We now risk a world where no-one is allowed to make an MPU without 
> built-in mechanisms for surveillance and control.  General-purpose 
> computing would become a thing of the past, and anyone who tried to 
> build one would be a criminal.  Free software, of course, would be dead 
> in the water.
> 
> I'm now wondering if this counts as an existential risk.

If I understood this correctly (I've read the transcript), the problem is 
that making a device that can only execute authorized code is close to 
impossible - and if such a device were made, it would be close to useless, 
IMHO.
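The "authorized code only" idea boils down to an allowlist check in front of every program launch. A minimal sketch (the function names and the toy allowlist are mine, invented for illustration, assuming SHA-256 as the fingerprint):

```python
import hashlib

# Hashes of the only programs the vendor "authorizes" (toy values).
AUTHORIZED = {
    hashlib.sha256(b"print('approved app')").hexdigest(),
}

def run_if_authorized(code: str) -> bool:
    """Execute code only if its hash appears on the vendor allowlist."""
    if hashlib.sha256(code.encode()).hexdigest() in AUTHORIZED:
        exec(code)  # on real hardware, this is the firmware jumping to code
        return True
    return False

run_if_authorized("print('approved app')")  # runs: hash is on the list
run_if_authorized("print('pirate app')")    # refused: hash is not listed

# The catch: the gatekeeper is itself just code on a general-purpose
# machine - remove the check and anything runs. Enforcing it on every
# device in existence is the "close to impossible" part.
```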

So it is (close to) impossible to make a computer that could not help 
with, say, pirating content.

Even if it were possible, technology at the same time makes it possible 
to do many creative things with any kind of hardware out there, so any 
"clever" and "sophisticated" copy-protection scheme gets broken (and this 
can be watched almost live, that's how quick the wise guys are).

Basically it all comes down to the fact that everybody can have a 
general-purpose computing device dirt cheap. I am speaking of some cheap 
8-bit thing that requires some knowledge of electronics to be of any use 
(how and what to solder, etc.), but even with such a simple thing one can 
do a lot of hacks. There are also tons of old but still-working hardware 
that can be hacked for decades to come.

With this on a desk, one can think up a lot of more complicated things.

The man is right, however, that media owners are small barkers compared 
to some gorillas who are only just starting to wake up.

But to help them would mean removing the technology from under our own 
feet.

This is doable, but if done it would nullify a lot - basically, moving us 
all back to the 19th century at best and putting the works of Alan Turing 
and Alonzo Church on the index (along with everything else that stemmed 
from them). Oh, I forgot about Charles Babbage. So we have to hop back 
somewhere into the 1830s... No, even worse. We would have to censor 
Joseph Marie Jacquard's mechanical loom and its punched cards. This moves 
us back some 50 years, to around the 1780s. No, again, we have to 
consider a man named Gerbert d'Aurillac, later known as Pope Sylvester 
II, also known as a mathematician and constructor of a hydraulic-powered 
organ. After he learned logic, geometry, astronomy/astrology and other 
such stuff from the Arabs while in Spain, he was said to possess an 
automaton that could give yes/no answers when asked a question. This was 
probably only a legend, but the fact that such an idea was in circulation 
1000 years ago is mind-boggling to me.

Oh crap. To remove computation we would have to be rewound back to the 
10th century.

No, again. From what I have heard, the ancient Greeks had simple 
automatons. They used them on a daily basis - like an automaton that 
dispensed water after being given a certain amount of money, or automatons 
used for animating things in the theatre. Those guys were really good, 
probably on the verge of inventing the steam turbine. Oh yeah, and they 
made mechanical calculators with bronze gears - the one and only such 
calculator found, near Antikythera, is said to be from the 1st century 
BC, but since it is so good and well made, it must have been a later 
design and not a prototype. All this was 2000-2500 years ago.

So I wouldn't be surprised at all if there were more such inventions in 
the past, now buried under soil or melted down by some warrior-king in 
need of more bronze.

Now, I think the real problem is not general-purpose computing but 
mathematics, because most of this computing comes straight from 
mathematicians - like Muhammad ibn Musa al-Khwarizmi, Alonzo Church, Alan 
Turing... For a summary and even more names - like Heron of Alexandria, 
who is said to have built a programmable cart - see here:

http://en.wikipedia.org/wiki/History_of_computing_hardware

So, overall, one has to arrive at the idea that scientists and engineers 
are the worst enemies of technologically oriented corporations. And those 
very same corporations have to rely on them or perish. An interesting 
situation.

Perhaps it is possible to have a very small population, say 5-10 million 
technologically savvy folks, sitting in some isolated area and keeping 
all tech under their control, while the rest of the planet remains a 
technological reservation with folks eating raw meat and whatever else 
they find. Perhaps it is even possible to transition from the current 
state of things to this nasty scenario (I can imagine it, but really, 
nothing interesting). BTW, speaking of Greeks, I don't remember where I 
read about it, but once upon a time Hephaestus built tripod automatons 
and a few other things for his fellow gods, who were quite dependent on 
him - to such an extent that after he was exiled from Olympus they went 
after him, begging for his return. The very same gods freaked out after 
Prometheus stole fire (maybe from Hephaestus' workshop) and gave it to 
mortal people.

Anyway, back to your question - there is some risk, but from my point of 
view anybody trying to solve this "problem" would do a big disservice 
first to other people and later to himself, too. It's more like a house 
of cards. One can remove cards, sure. The house will, however, collapse. 
And it will be rebuilt, possibly up to the point where people's 
creativity will scare the shit out of humanity's managers. And so on.

Is it wise to try building a house of cards without cards? No, if you ask 
me. If things turned out like this, I wouldn't be very surprised, however 
(that's just very typical human irrationality).

On the other hand, maybe things will go in some more positive direction. 
Even Cory Doctorow remarked that current technological progress puts us 
all in a fast lane, going somewhere - but obviously not much can be said 
about where the journey ends, or whether it ends at all. The only certain 
knowledge about motion is motion itself, it seems. Unless we crash, as 
happens to some fast movers.

For now, we have the universal computer, which can be told to compute all 
kinds of numbers, including the numbers that unlock copyright protection 
schemes.
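The point can be made concrete with a toy "lock" (entirely my own invention, nothing like a real DRM scheme): content scrambled by XOR with a one-byte key. The unlocking key is just a number, and any universal computer can be told to compute it.

```python
def lock(data: bytes, key: int) -> bytes:
    """'Protect' data by XOR-ing every byte with a one-byte key."""
    return bytes(b ^ key for b in data)

def brute_force_unlock(locked: bytes, known_prefix: bytes) -> int:
    """Recover the key by simply trying every possible value (0..255)."""
    for key in range(256):
        if lock(locked, key).startswith(known_prefix):
            return key
    raise ValueError("no key found")

# "Locked" content - in a real scheme the key space would be far bigger,
# but the principle is the same: the unlock key is a computable number.
locked = lock(b"PLAINTEXT: some locked-up bytes", 0x5A)

key = brute_force_unlock(locked, b"PLAINTEXT")
print(hex(key))           # -> 0x5a, the "unlocking number"
print(lock(locked, key))  # XOR-ing again with the key restores the content
```

A real scheme uses a much larger key space and real cryptography, but a general-purpose machine is never limited to running only the vendor's decoder - which is the whole point of the talk.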

Next in the programme, we will have the universal constructor, able to 
fabricate medicines to heal us and guns to shoot those already healthy 
enough to be killed. The introduction of the universal constructor, or 
something less universal but still quite capable, might be postponed to 
some extent but will be very hard to stop - a house of cards, mind you.

The most important existential risk, for me, is the problem of human 
irrationality. And I am not even sure it can be helped at all, even with 
all the technology humans can make, now and in the future.

However, taking my more optimistic point of view now, history might be 
seen as a fight between irrationality and creativity. So far, creativity 
wins, or at least buys itself more time by throwing toys and meat to 
irrationality. As long as irrationality is fed and entertained, we can 
survive.

Regards,
Tomasz Rola

--
** A C programmer asked whether computer had Buddha's nature.      **
** As the answer, master did "rm -rif" on the programmer's home    **
** directory. And then the C programmer became enlightened...      **
**                                                                 **
** Tomasz Rola          mailto:tomasz_rola at bigfoot.com             **


