[Paleopsych] Wired: Kevin Kelly: Unto us the Machine is born
Premise Checker
checker at panix.com
Mon Nov 28 22:52:06 UTC 2005
Kevin Kelly: Unto us the Machine is born
http://www.smh.com.au/news/next/unto-us-the-machine-is-born/2005/11/14/1131816858554.html?oneclick=true
November 15, 2005
By 2015 the internet as we know it will be dead, killed by a globe-spanning
artificial consciousness, writes founding Wired editor Kevin Kelly.
THE web continues to evolve from an entity ruled by mass media and mass
audiences to one ruled by messy media and messy participation. How far can this
frenzy of creativity go? Encouraged by web-enabled sales, 175,000 books were
published and more than 30,000 music albums were released in the US last year.
At the same time, 14 million blogs were launched worldwide.
All these numbers are escalating. A simple extrapolation suggests that in the
near future everyone alive will (on average) write a song, author a book, make
a video, craft a weblog, and code a program. This idea is less outrageous than
the notion 150 years ago that some day everyone would write a letter or take a
photograph.
What happens when the data flow is asymmetrical - but in favour of creators?
What happens when everyone is uploading far more than they download? If
everyone is busy making, altering, mixing and mashing, who will have time to
sit back and veg out? Who will be a consumer?
No one. And that's just fine. A world in which production outpaces consumption
should not be sustainable; that's a lesson from economics 101. But online,
where many ideas that don't work in theory succeed in practice, the audience
increasingly doesn't matter. What matters is the network of social creation,
the community of collaborative interaction that futurist Alvin Toffler called
prosumption. As with blogging and BitTorrent, prosumers produce and consume at
once. The producers are the audience, the act of making is the act of watching,
and every link is both a point of departure and a destination.
But if a roiling mess of participation is all we think the web will become, we
are likely to miss the big news, again. The experts are certainly missing it.
The Pew Internet & American Life Project surveyed more than 1200 professionals
in 2004, asking them to predict the net's next decade. One scenario earned
agreement from two-thirds of respondents: "As computing devices become embedded
in everything from clothes to appliances to cars to phones, these networked
devices will allow greater surveillance by governments and businesses."
Another was affirmed by one-third: "By 2014, use of the internet will increase
the size of people's social networks far beyond what has traditionally been the
case."
These are safe bets, but they fail to capture the web's disruptive trajectory.
The real transformation under way is more akin to what Sun Microsystems' John
Gage had in mind in 1988 when he famously said: "The network is the computer."
His phrase sums up the destiny of the web: as the operating system for a
megacomputer that encompasses the internet, all its services, all peripheral
chips and affiliated devices from scanners to satellites, and the billions of
human minds entangled in this global network.
This gargantuan Machine already exists in a primitive form. In the coming
decade, it will evolve into an integral extension not only of our senses and
bodies, but our minds.
Today the Machine acts like a very large computer, with top-level functions
that operate at about the clock speed of an early PC. It processes 1 million
emails each second, which essentially means networking runs at 100 kilohertz,
SMS at 1 kilohertz. The Machine's total external RAM is about 200 terabytes. In
any one second, 10 terabits can be coursing through its backbone, and each year
it generates nearly 20 exabytes of data. Its distributed "chip" spans 1 billion
active PCs, which is about the number of transistors in one PC.
This planet-sized computer is comparable in complexity to a human brain. Both
the brain and the web have hundreds of billions of neurons, or webpages. Each
biological neuron sprouts synaptic links to thousands of other neurons, and
each webpage branches into dozens of hyperlinks. That adds up to a trillion
"synapses" between the static pages on the web. The human brain has about 100
times that number - but brains are not doubling in size every few years. The
Machine is.
Since each of its "transistors" is itself a personal computer with a billion
transistors running lower functions, the Machine is fractal. In total, it
harnesses a quintillion transistors, expanding its complexity beyond that of a
biological brain. It has already surpassed the 20-petahertz threshold for
potential intelligence as calculated by Ray Kurzweil ("Human 2.0", Next 25/10).
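The scale comparisons above are back-of-envelope estimates built from the article's own round numbers (hundreds of billions of pages, dozens of links per page, a billion PCs with a billion transistors each). As an illustrative sketch, the arithmetic works out like this:

```python
# Back-of-envelope check of the article's scale estimates. All figures
# are the article's own round numbers, not measurements.

webpages = 100e9          # "hundreds of billions" of pages: use 10^11
links_per_page = 10       # "dozens of hyperlinks": use ~10 per page
web_synapses = webpages * links_per_page          # ~10^12 web "synapses"

brain_synapses = 100 * web_synapses               # "about 100 times that number"

pcs = 1e9                 # "1 billion active PCs"
transistors_per_pc = 1e9  # each PC holds roughly a billion transistors
total_transistors = pcs * transistors_per_pc      # 10^18, i.e. a quintillion

print(f"web 'synapses':    {web_synapses:.0e}")      # ~10^12
print(f"brain synapses:    {brain_synapses:.0e}")    # ~10^14
print(f"total transistors: {total_transistors:.0e}") # ~10^18
```

The 100-fold synapse gap is what the doubling argument addresses: a quantity doubling every couple of years covers two orders of magnitude in well under two decades.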
For this reason some researchers pursuing artificial intelligence have switched
their bets to the net as the computer most likely to think first.
Danny Hillis, a computer scientist who once claimed he wanted to make an AI
"that would be proud of me", has invented massively parallel supercomputers, in
part to advance us in that direction. He now believes the first real AI will
emerge not in a stand-alone supercomputer such as IBM's proposed 23-teraflop
Blue Brain, but in the vast tangle of the global Machine.
In 10 years the system will contain hundreds of millions of miles of
fibre-optic neurons linking the billions of ant-smart chips embedded into
manufactured products, buried in environmental sensors, staring out from
satellite cameras, guiding cars, and saturating our world with enough
complexity to begin to learn. We will live inside this thing.
Today the nascent Machine routes packets around disturbances in its lines; by
2015 it will anticipate disturbances and avoid them. It will have a robust
immune system, weeding spam from its trunk lines, eliminating viruses and
denial-of-service attacks the moment they are launched, and dissuading
malefactors from injuring it again. The patterns of the Machine's internal
workings will be so complex they won't be repeatable; you won't always get the
same answer to a given question. It will take intuition to maximise what the
global network has to offer. The most obvious development birthed by this
platform will be the absorption of routine. The Machine will take on anything
we do more than twice. It will be the Anticipation Machine.
ONE great advantage the Machine holds in this regard: it's always on. It is
very hard to learn if you keep getting turned off, which is the fate of most
computers.
AI researchers rejoice when an adaptive learning program runs for days without
crashing. The foetal Machine has been running continuously for at least 10
years (30 if you want to be picky). I am aware of no other machine that has run
that long with no downtime. Portions may spin down because of power outages or
cascading infections, but the entire thing is unlikely to go quiet in the coming
decade. It will be the most reliable gadget we have.
And the most universal. By 2015, desktop operating systems will be largely
irrelevant. The web will be the only OS worth coding for. It won't matter what
device you use, as long as it runs on the web OS. You will reach the same
distributed computer whether you log on via phone, PDA, laptop, or HDTV.
By 2015 the '90s image of convergence will turn inside-out. Each device is a
differently shaped window that peers into the global computer. Nothing
converges. The Machine is an unbounded thing that will take a billion windows
to glimpse even part of. It is what you'll see on the other side of any screen.
And who will write the software that makes this contraption useful and
productive?
We will. Each of us already does it every day. When we post and then tag
pictures on the community photo album Flickr, we are teaching the Machine to
give names to images. The thickening links between caption and picture form a
neural net that learns.
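The tag-and-learn loop described above can be sketched as a toy association tally. This is a hypothetical illustration, not Flickr's actual mechanism: the "features" are stand-in string labels for real image data, and the function names are invented for the example.

```python
from collections import Counter, defaultdict

# Toy tally: every time a user tags a photo, the link between the tag
# and the image's features grows stronger. "Features" here are just
# hypothetical labels standing in for real visual data.
associations = defaultdict(Counter)

def tag_photo(features, tag):
    """Record one user tagging one photo."""
    for feature in features:
        associations[feature][tag] += 1

def suggest_name(features):
    """Name a new image by the tag most associated with its features."""
    votes = Counter()
    for feature in features:
        votes.update(associations[feature])
    return votes.most_common(1)[0][0] if votes else None

# Two users tag sunset photos; one tags a cat photo.
tag_photo(["orange_sky", "horizon"], "sunset")
tag_photo(["orange_sky", "sea"], "sunset")
tag_photo(["whiskers", "fur"], "cat")

print(suggest_name(["orange_sky"]))  # the accumulated votes favour "sunset"
```

Each new tagging event shifts the vote counts, which is the sense in which the thickening links "learn" from use.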
Think of the 100 billion times a day humans click on a webpage as a way of
teaching the Machine what we think is important. Each time we forge a link
between words, we teach it an idea. Wikipedia encourages its citizen authors to
link each fact in an article to a reference citation. Over time, a Wikipedia
article becomes totally underlined in blue as ideas are cross-referenced. That
cross-referencing is how brains think and remember. It is how neural nets
answer questions. It is how our global skin of neurons will adapt autonomously
and acquire a higher level of knowledge.
The human brain has no department full of programming cells that configure the
mind. Brain cells program themselves simply by being used. Likewise, our
questions program the Machine to answer questions. We think we are merely
wasting time when we surf mindlessly or blog an item, but each time we click a
link we strengthen a node somewhere in the web OS, thereby programming the
Machine by using it.
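The "strengthen a node by clicking it" idea resembles a Hebbian weight update: connections that get used grow stronger, and unused ones fade. A minimal sketch, with entirely hypothetical link names and decay rate:

```python
from collections import defaultdict

# Crude Hebbian-style update: each click strengthens a link's weight,
# and a periodic decay lets neglected links fade.
weights = defaultdict(float)

def click(link, boost=1.0):
    weights[link] += boost      # use strengthens the connection

def decay(rate=0.99):
    for link in weights:        # unused connections slowly weaken
        weights[link] *= rate

def top_links(n=3):
    """Rank links by accumulated, decayed click weight."""
    return sorted(weights, key=weights.get, reverse=True)[:n]

for _ in range(5):
    click("wiki/Neural_network")
click("blog/cool-tools")
decay()

print(top_links(2))  # the most-clicked link ranks first
```

Under this toy model, "programming the Machine by using it" is just the aggregate of such updates across billions of clicks.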
What will most surprise us is how dependent we will be on what the Machine
knows - about us and about what we want to know. We already find it easier to
Google something rather than remember it. The more we teach this megacomputer,
the more it will assume responsibility for our knowing. It will become our
memory. Then it will become our identity. In 2015 many people, when divorced
from the Machine, won't feel like themselves - as if they'd had a lobotomy.
There is only one time in the history of each planet when its inhabitants first
wire up its parts to make one large Machine. Later that Machine may run faster,
but there is only one time when it is born.
You and I are alive at this moment. We should marvel, but people alive at such
times usually don't. Every few centuries, the steady march of change meets a
discontinuity, and history hinges on that moment. We look back on those pivotal
eras and wonder what it would have been like to be alive then. Confucius,
Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same
historical era, an inflection point known as the axial age of religion. Few
world religions were born after this time. Similarly, the great personalities
converging upon the American Revolution and the geniuses who commingled during
the invention of modern science in the 17th century mark additional axial
phases in the short history of our civilisation.
Three thousand years from now, when keen minds review the past, I believe that
our ancient time, here at the cusp of the third millennium, will be seen as
another such era.
In the years roughly coincidental with the Netscape IPO, humans began animating
inert objects with tiny slivers of intelligence, connecting them into a global
field, and linking their own minds into a single thing. This will be recognised
as the largest, most complex, and most surprising event on the planet. Weaving
nerves out of glass and radio waves, our species began wiring up all regions,
all processes, all facts and notions into a grand network.
From this embryonic neural net was born a collaborative interface for our
civilisation, a sensing, cognitive device with power that exceeded any previous
invention. The Machine provided a new way of thinking (perfect search, total
recall) and a new mind for an old species. It was the Beginning.
Netscape's float was a puny rocket to herald such a moment. First moments are
like that. After the hysteria dies down and the millions of dollars are made
and lost, the strands of mind, once achingly isolated, come together.
Today, our Machine is born. It is on.
They couldn't have done it without you
The total number of webpages now exceeds 600 billion. That's 100 pages per
person alive.
In fewer than 4000 days, we have encoded half a trillion versions of our
collective story and put them in front of 1 billion people, or one-sixth of the
world's population. That remarkable achievement was not in anyone's 10-year
plan.
The accretion of tiny marvels can numb us to the arrival of the stupendous.
Today, at any net terminal, you can get: an amazing variety of music and video,
an evolving encyclopedia, weather forecasts, help-wanted ads, satellite images
of any place on earth - just to name a few applications - all wrapped up in an
interactive index that really works.
This view is spookily godlike. You can switch your gaze on a spot in the world
from map to satellite to 3-D just by clicking. Recall the past? It's there. Or
listen to the daily complaints and travails of almost anyone who blogs (and
doesn't everyone?). Ten years ago you would have been told there wasn't enough
money in all the investment firms in the world to fund such a cornucopia. The
success of the web at this scale was impossible. But if we have learned
anything in the past decade, it is the plausibility of the impossible.
In about 4000 days, eBay has gone from a marginal experiment in community
markets in the San Francisco Bay area to the most profitable spin-off of
hypertext. At any one moment, 50 million auctions race through the site.
What we all failed to see was how much of this new world would be manufactured
by users, not corporate interests. Amazon.com customers rushed with surprising
speed and intelligence to write the reviews that made the site useable. Owners
of Adobe, Apple and most major software products offer help and advice on the
developer's forum web pages. And in the greatest leverage of the common user,
Google turns traffic and link patterns generated by 2 billion searches a month
into the organising intelligence for a new economy.
No web phenomenon is more confounding than blogging. Everything media experts
knew about audiences - and they knew a lot - confirmed the focus group belief
that audiences would never get off their butts and start making their own
entertainment.
What a shock, then, to witness the near-instantaneous rise of 50 million blogs,
with a new one appearing every two seconds. These user-created channels make no
sense economically. Where are the time, energy and resources coming from?
The audience.
I run a blog about cool tools. The web extends my passion to a far wider group
for no extra cost or effort. My site is part of a vast and growing gift
economy, a visible underground of valuable creations - free on inquiry. This
spurs the grateful to reciprocate. It permits easy modification and re-use, and
thus promotes consumers into producers.
The electricity of participation nudges ordinary folk to invest huge hunks of
energy and time into making free encyclopedias or creating public tutorials for
changing a flat tyre. A study found that only 40 per cent of the web is
commercial. The rest runs on duty or passion.
This follows the industrial age, by the way, when mass-produced goods
outclassed anything you could make yourself. The impulse for participation has
up-ended the economy and is steadily turning the sphere of social networking
into the main event.
Once, we, the public, just about only downloaded. Today, the poster child of
the new internet regime is BitTorrent, under which users upload stuff while
they are downloading. It assumes participation.
And the web embeds itself into every class, occupation and region. Everyone
missed the 2002 flip-point when women online suddenly outnumbered men. The
average user is now a 41-year-old woman.
What could be a better mark of irreversible acceptance than adoption by the
technology-reluctant American rural sect, the Amish?
On a visit recently, I was amazed to hear some Amish farmers mention their
websites.
"Amish websites?" I asked.
"For advertising our family business. We weld barbecue grills in our shop."
"Yes, but . . ."
"Oh, we use the internet terminal at the public library. And Yahoo!"
I knew then the battle was over.
Back to the future
Computing pioneer Vannevar Bush outlined the web's core idea - hyperlinked
pages - in 1945, but the first person to try to build on the concept was a
freethinker named Ted Nelson, who in 1965 envisioned his own scheme, which he
called "Xanadu". But he had little success connecting digital bits on a useful
scale and his efforts were known only to an isolated group of disciples. Few of
the hackers writing code for the emerging web in the 1990s knew about Nelson or
his hyperlinked dream machine.
At the suggestion of a computer-savvy friend, I got in touch with Nelson in
1984, a decade before Netscape made Marc Andreessen a millionaire. We met in a
dark dockside bar in Sausalito, California. Folded notes erupted from his
pockets, and long strips of paper slipped from overstuffed notebooks. He told
me about his scheme for organising all the knowledge of humanity. Salvation lay
in cutting up 3 x 5 cards, of which he had plenty.
Legend has it that Ted Nelson invented Xanadu as a remedy for his poor memory
and attention deficit disorder. He was certain that every document in the world
should be a footnote to some other document, and computers could make the
(hyper)links between them visible and permanent. He sketched out complicated
notions of transferring authorship back to creators and tracking payments as
readers hopped along networks of documents, what he called the docuverse. He
spoke of "transclusion" and "intertwingularity" as he described the grand
utopian benefits of his embedded structure.
It was clear to me a hyperlinked world was inevitable. But what surprises me is
how much was missing from Vannevar Bush's vision, Nelson's docuverse, and my
own expectations. The web revolution heralded a new kind of participation that
is developing into a culture based on sharing.
By ignoring the web's reality, we are likely to miss what it will grow into
over the next 10 years. Any hope of discerning the state of the web in 2015
requires that we own up to how wrong we were 10 years ago.
Originally published in Wired