[extropy-chat] Turbulence of obsolescence (was: Anti-virus protection -- problem fixed!)

Adrian Tymes wingcat at pacbell.net
Tue Apr 19 21:23:53 UTC 2005


--- Eugen Leitl <eugen at leitl.org> wrote:
> On Mon, Apr 18, 2005 at 01:46:18PM -0700, Adrian Tymes wrote:
> > On the contrary.  Ignoring criminal hackers and other agents of ill
> > (who would be applauded if they resigned en masse without
> > replacements), how many of our technically capable people have
> > settled for a professional life of cleaning up Microsoft's messes?
> 
> Most IT professionals make a living feeding on detritus from
> Redmond.
> (Admittedly, professional coprophagia is an acquired taste. Uck.
> Ptui).

My impression is that this overstates things.  Yes, there is a large
and thriving portion of IT that is based on fixing Microsoft's
problems, but I have worked for many companies whose core products or
services had nothing to do with Microsoft's offerings, except maybe
that they sometimes worked on Windows too.  For example, Web services:
if you design to be non-browser-specific, as most sites do (although a
significant minority build for MSIE only - usually only for a handful
of months, until customer complaints make clear what a bad idea that is
and they find time to rework the site to be browser-neutral), then you
simply don't care what OS the people visiting your site are using.  (My
present employer, http://www.teleo.com/ , is an example of this.)  Or
take most supercomputing applications (mostly Unix and its variants),
or most smaller-than-desktop apps (embedded Linux or Palm and its
derivatives dominate here, to my knowledge).
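
As a rough sketch of what I mean by browser-neutral design (the
function and details here are hypothetical, just for illustration):
test for the feature you actually need rather than sniffing for a
particular browser, and the same page works whatever browser or OS the
visitor happens to run.

    // Hypothetical sketch: feature detection rather than browser sniffing.
    // The page behaves the same whether the visitor runs Windows, Mac,
    // Unix, or anything else with a standards-following browser.
    function createRequest() {
        if (typeof XMLHttpRequest !== "undefined") {
            return new XMLHttpRequest();   // standard, vendor-neutral API
        }
        return null;   // degrade gracefully instead of demanding one vendor's browser
    }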

Microsoft rules the desktop computing industry.  That's a far cry from
all of IT.

> > What kinds and quantities of good could they do, if that need were
> > not there and their labors free to serve other industries?  'Tis
> > like the
>
> This assumes the people so occupied would be capable of serving
> other industries, and not be out of a job.

I believe that most human beings are adaptable to changing
circumstances.  Granted, it might be socially wise to tap the profits
of new tech to retrain workers made obsolete by it, and perhaps to
provide temporary unemployment benefits while they are being retrained
(time-limited, to curb abuse), once the new tech gains enough momentum
that significant numbers of workers are being displaced (so as not to
kill the new tech before it gets going).  But people these days can and
do have more than one career over the course of their lives.  (And
that's before we have radical age extension or immortality.)

> > buggy whip manufacturers on the eve of the Model T, whose leather
> > and labors subsequently went on to find other uses that could not
> > profitably be served while buggy whips were needed.
>
> What do you think would be such budding industries, and why would
> displaced IT people have suitable skills (not all of them are
> trainable, assuming there's money and incentive to pay for training)
> to be employable in those local budding industries?

These days, to work is to be trainable.  Refuse to accept training as
one's job changes to require it, and one's skills quickly rot away
anyway, causing one to join the ranks of the unemployable.  Retarding
the progress of technology to keep these people in jobs harms the rest
of society by denying everyone the benefits of the new ways.  Besides,
old needs rarely go away completely (there are still whip manufacturers
today), so those who absolutely can not learn new skills can compete
for the few remaining jobs while the rest of us move on.

Again, given the inevitable shortage of labor in new industries (it's
new, so there isn't already a wide pool of labor trained in the new
industry's particulars), using the profits (once there are significant
profits) to give the displaced the necessary skills seems like the best
generally applicable solution.

> > Yes, there would be significant short-term economic dislocations,
> > big enough to strain our social safety nets.  But imagine the
> > computing
>
> I'm living right in the middle of an economic dislocation, and given
> that it's in its second decade it's not that short-term. Prospects
> are pretty dismal.

Which dislocation are you referring to?  If you mean the dot-bomb
crash, the industry's pretty much recovered by now.  Stock prices might
not be at the bubble's high, but they are certainly higher than they
were before the bubble started.  There might not be $100K salaries for
people who could crank out a basic HTML page, but those people are
employable - if they are willing to accept that that skill isn't
actually all that hard (which it never was; there was just a temporary
scarcity which drove up prices) or are willing to learn new tricks.
Every case I've looked into of a technical worker who "can not" find
work is someone whose skills weren't that advanced, who refuses to
acknowledge the current existence of lower-paying jobs for basic
skills, and who won't develop their technical skills to meet current
demands.  (Although it is only partially their fault: there are also a
lot of companies that "can not" find skilled technical workers, but
could find a number of them if they'd just raise their offered salaries
a bit and/or otherwise adjust to the modern realities of telecommuting
et al.  This doesn't entirely excuse the workers, though.)

If you mean another dislocation, I'd have to know which before knowing
if I know anything about it.  (It could help prove my point, it could
disprove it; I don't yet know.)

> > applications that would become feasible if you really could trust
> > standard personal computers.  Imagine the collapse in bandwidth
> > prices,
>
> Who would be paying to write these applications? Such talent is rare,
> and already well accounted for (but in the developing countries).

You mean "but not in the developing countries", right?

The talent might be rare, but talent can be nurtured and developed.
That would likely be part of retraining in this case.  Innate genius,
in most cases, is not actually impossible to duplicate, merely
impractical by the standards of the time - and standards can change.
(Until they do, though, there is often little real difference between
"impossible" and "impractical", thus the two get confused a lot.)

Imagine, for instance, a blacksmith in the early days of metalworking,
who put a year into study of his craft.  His works would be considered
art by those around him.  Now imagine if he were time-teleported a
thousand years* into the future, where most blacksmiths apprenticed for
a year or more before being considered worthy independent operators.
After adjusting for culture shock, he would find his skills merely
adequate.

* As adjusted for the slow rate of change back then.  The modern
equivalent would be about 20-40 years, maybe less, certainly within a
normal professional career.  (Of course, it is the time-teleport that
prevents someone from building up 20 years of professional experience
along the way; that natural buildup is why we don't see this problem
much more often than we already do.)

> > and subsequent high availability of bandwidth for everyone, if DDOS
> > attacks and spam became mostly historical footnotes.  (And I'm sure
> > we
>
> DDoS and spam have about zero impact on the traffic cost. ISPs are
> well-equipped to deal even with surging traffic due to P2P, given the
> post-dotcom-bomb overcapacity.

Last I'd heard, DDOS and spam account for about two-thirds of an ISP's
typical bandwidth costs.  Yes, bandwidth is cheap right now because of
overcapacity, but costs are costs, and the overcapacity would remain
even if DDOS and spam went away.

> > But still, a problem we wish we had (so we could enjoy the things
> > that come with the problem) is still a problem, and problems
> > generally need solving.  And this problem has a more generic form
> > that we will face, if our dreams come to pass - and it is a pretty
> > large one.  I wonder, is there a useful way to break down the
> > problem of transitioning workers and investments, once they have
> > been displaced by new technologies, into other markets - including
> > and especially ones made possible, or at least profitable, by these
> > same new technologies?
>
> You'd do well by identifying these technologies first.

Actually, I was hoping for an approach that would be broadly applicable
to most technologies.  Identifying opportunities for only a few
specific techs doesn't necessarily lead to that; identifying, in
specific techs, examples of broad patterns that could apply elsewhere
just might.

> Right now, in the old industrial countries there aren't any.
> Automation is releasing lots of people into the unemployed pool, and
> we haven't even started yet (financial and postal are hemorrhaging
> heavily, and logistics is next).
>
> Add AI and robotics, and it truly hits the fan.

Ironic that you should mention those.  I'd call applied AI (not basic
research into cognition, but actual products and services that can
easily prove their immediate financial worth, like trend analysis
software when it works) and advanced adaptable robotics (like the
Asimo, or rescue & military drones that can largely carry out their
operations with nothing more than guidance from home base and a home
base to return to afterwards) examples of new technologies that could
benefit from having a lot more people working on them.

There's also advanced nanofabrication - not just the bulk materials
that are the typical "nanoproducts" of today, but actual functional
machines and electronic components.  I'm working on building one such
device myself, and I seem to be inventing my own processes to a degree
that you'd never have to do in a mature industry.  (That, or I'm
reinventing the wheel - but those I talk to in detail about this
project, who would know about existing wheel-equivalents, haven't
pointed them out to me, so it appears unlikely they're already out
there.)  I'm doing my best to document my processes so that, if my
experiment is successful, it can be replicated elsewhere - but the
highly custom state of the equipment at the lab where I'm doing this,
and the probably equally custom state at other labs that could
replicate the experiment, means that any such replication would be
difficult in the extreme.  (It'd be significant work just to replicate
it at the same
lab, using the same equipment with the same settings and the same
materials...and any such replication could be suspected of being due to
undocumented flukes of the lab's equipment, among other possibilities.)
But that might be a little too advanced, given the R&D that has to be
done before it can be pushed into products on the store shelf.


