[ExI] Progress curves

Amara D. Angelica amara at kurzweilai.net
Mon Jul 11 12:00:33 UTC 2011


Posted here:
http://www.kurzweilai.net/thinking-quantitatively-about-technological-progress
-- comments welcome

-----Original Message-----
From: extropy-chat-bounces at lists.extropy.org
[mailto:extropy-chat-bounces at lists.extropy.org] On Behalf Of Anders Sandberg
Sent: Sunday, July 10, 2011 1:27 AM
To: ExI chat list
Subject: [ExI] Progress curves

I have been thinking about progress a bit recently, mainly because I 
would like to develop a mathematical model of how brain scanning 
technology and computational neuroscience might develop.

In general, I think the most solid evidence of technological progress
comes from Wrightean experience curves. These are well documented in
economics and found everywhere: typically the cost (or time) of
manufacturing per unit behaves as x^a, where a < 0 (typically something
like -0.1) and x is the number of units produced so far. When you make
more things, you learn how to make the process better.
http://en.wikipedia.org/wiki/Experience_curve_effects
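
A minimal sketch in Python of what such a curve looks like, and of how
the exponent can be recovered from noisy cost data by a log-log fit
(the values C0 = 100 and a = -0.1 are illustrative, not real data):

    import numpy as np

    # Wrightean experience curve: cost per unit C(x) = C0 * x^a with
    # a < 0, where x is the cumulative number of units produced.
    # C0 and a are illustrative values, not taken from real data.
    C0, a = 100.0, -0.1
    x = np.arange(1, 10001)        # cumulative units produced so far
    cost = C0 * x**a               # cost per unit after x units

    # Given noisy observed costs, the exponent is the slope of a
    # straight-line fit in log-log coordinates.
    observed = cost * np.exp(0.02 * np.random.randn(x.size))
    a_hat, logC0_hat = np.polyfit(np.log(x), np.log(observed), 1)
    print(f"fitted exponent a ~ {a_hat:.3f} (true value {a})")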

On the output side we have performance curves: how many units of 
something useful can we get per dollar. The Santa Fe Institute 
performance curve database http://pcdb.santafe.edu/ is full of 
interesting evidence of things getting better/cheaper. Bela Nagy has
argued that we typically see "Sahal's Law": exponentially increasing
sales (since a tech becomes cheaper and more ubiquitous) combine with
exponential progress to produce Wright's experience curves:
http://192.12.12.16/events/workshops/images/4/4f/Nagy.ModelingOrganizationalComplexity.pdf
http://tuvalu.santafe.edu/~bn/workingpapers/NagyFarmerTrancikBui.pdf
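
A quick numerical check of that relation (the growth rate g of sales
and the decline rate r of cost below are made-up values): if cumulative
production grows as x(t) = x0*exp(g*t) and cost falls as
c(t) = c0*exp(-r*t), eliminating t gives c proportional to x^(-r/g),
i.e. a Wright curve with exponent -r/g.

    import numpy as np

    # Sahal's law sketch: exponential sales plus exponential (Moore-
    # style) cost decline imply a Wright experience curve. The rates
    # g and r are illustrative, not estimated from data.
    g, r = 0.30, 0.03              # sales growth rate, cost decline rate
    t = np.linspace(0, 50, 500)
    x = 1.0 * np.exp(g * t)       # cumulative units sold
    c = 100.0 * np.exp(-r * t)    # cost per unit

    # The implied Wright exponent is -r/g; recover it from the data.
    slope = np.polyfit(np.log(x), np.log(c), 1)[0]
    print(f"log-log slope {slope:.3f}, predicted -r/g = {-r/g:.3f}")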

One interesting problem might be that some techs are limited because
the number of units sold will eventually level off. In sales of new
technology we see Bass curves: a sigmoid curve where at first a handful
of early adopters get the product, then more and more people get it
(since people copy each other, this phase is roughly exponential), and
then sales level off as most potential buyers already have it. There is
a lot of literature on it, but it is useless for forecasting (due to
noise sensitivity in the early days). If Bela is right, this would mean
that a technology obeying the Moore-Sahal-Wright relations would indeed
follow a straight line in the "total units sold" vs. "cost per unit"
diagram, but there would be a limit point, since total units sold
eventually levels off (once you have railroads to every city, building
another one will not be useful; once everybody has good enough graphics
cards, they will buy far fewer).
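
For concreteness, a sketch of the standard Bass diffusion model,
dN/dt = (p + q*N/m)*(m - N), simulated by simple Euler stepping; the
market size m, innovation coefficient p, and imitation coefficient q
are illustrative values, not fitted to anything:

    import numpy as np

    # Bass diffusion: adoption is roughly exponential early on, then
    # levels off as the market of size m saturates.
    m, p, q = 1e6, 0.03, 0.38
    dt, steps = 0.1, 400
    N = np.zeros(steps)            # cumulative adopters over time
    for i in range(1, steps):
        rate = (p + q * N[i-1] / m) * (m - N[i-1])
        N[i] = N[i-1] + rate * dt

    print(f"adoption after {steps*dt:.0f} time units: {N[-1]/m:.1%}")

Feeding this saturating N into the Wright relation above is one way to
model the resulting stagnation: cost per unit stops falling once
cumulative production stops growing.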

The technology stagnates, and not because of any fundamental physics or
engineering limit: the real limit is the lack of economic incentive to
become much better.

Another aspect I find really interesting is whether a field shows
sudden jumps or continuous growth. Consider how much fluid dynamics
computation you can get per dollar. You have an underlying Moore's law
exponential, but discrete algorithmic improvements create big jumps as
more efficient ways of calculating are discovered. Typically these
improvements are big, worth a decade of Moore or so. But this mainly
happens in fields like software (chess program performance behaves like
this, and I suspect AI does too, if we could ever get a good
performance measure) where a bright idea changes the process a lot. It
is much rarer in fields constrained by physics (mining?) or where the
tech is composed of a myriad of interacting components (cars?).
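
A toy illustration of that pattern (doubling time, jump dates, and jump
sizes are all made up): an exponential hardware baseline multiplied by
occasional discrete algorithmic jumps, each worth roughly a decade of
Moore.

    import numpy as np

    # Moore-style baseline (doubling every 1.5 years) with discrete
    # algorithmic jumps; each jump is worth a decade of Moore, i.e.
    # a factor of 2^(10/1.5) ~ 100. Jump dates are hypothetical.
    years = np.arange(0.0, 40.0, 0.25)
    moore = 2.0 ** (years / 1.5)
    decade_of_moore = 2.0 ** (10 / 1.5)
    jumps = np.ones_like(years)
    for jump_year in (8.0, 22.0):  # hypothetical breakthrough dates
        jumps[years >= jump_year] *= decade_of_moore

    # On a log plot, performance is a straight line with two steps.
    performance = moore * jumps
    print(f"gain over 40 years: 10^{np.log10(performance[-1]):.1f}")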

Any other approaches you know of in thinking quantitatively about 
technological progress?

-- 
Anders Sandberg
Future of Humanity Institute
Oxford University



