[extropy-chat] "The Singularity Myth"

Robert Bradbury robert.bradbury at gmail.com
Sun Mar 19 17:11:13 UTC 2006


On 3/19/06, Russell Wallace <russell.wallace at gmail.com> wrote:

> Samantha in this thread suggests 50 years; I think the world has a bit
> more inertia than she does, changes both good and bad being slower, so my
> guess would be 100.
>

Samantha, I believe, is saying 50 years *max* (that is, 2056), which is even
beyond Robin's median date.  I think both are at the high end of the range.

Why?  Most singularity assumptions posit that robust nanotechnology is
required.  I would point out that we *have* robust nanotechnology but call
it biotechnology.  We have *today* single desk-sized machines that can take
apart a bacterial genome in an afternoon (to add to the 300+ already in
databases).  That is plenty of nanoparts to play with.  We have *today* two
well-funded companies working on synthetic genome assembly.  What most
people lack (excepting some, like Rafal and myself, who work or have worked
in these areas) is a reasonably good understanding of how much of the
"matter as software" gold ring bionanotechnology can deliver without
requiring dry diamondoid/sapphire-based nanotechnology (DDSN).

Many of the key "nanotechnology" promises (cheap solar energy, feeding the
entire world population, longevity extended by decades, probably centuries)
are enabled by bionanotechnology and do not require DDSN.  Significant
advances in our capabilities (intelligence amplification, automation,
robots, etc.) are enabled by current hard microelectronics trends through
2015 -- not by bionanotechnology, not by DDSN, not by "real" artificial
intelligence (whatever that is).  The only things missing from the complete
singularity picture are things like cryonic reanimation, ultra-high-bandwidth
connections between wet brains and the net, and those applications, such as
nanobots or cheap space access, which really do require robust DDSN-based
MNT.

*All* of this is before 2020.


> 1. De facto world government forms, with the result that progress goes the
> way of the Qeng Ho fleets. (The European Union is a disturbingly large step
> on this route.)
>

This option doesn't work unless the "world government" actually imprisons
everyone, especially the few thousand wealthiest individuals on Earth.  By
2020-2030, people like Gates, Jobs, Ellison, and the Google founders should
be able to exit stage left if governments (or conservative luddites) carry
self-preservation too far.

> 2. Continuing population crash renders progress unsustainable. (Continued
> progress from a technology base as complex as today's requires very large
> populations to be economically feasible.)
>

This requires that we completely lose the knowledge of basic biology
scattered all around the world, along with the means to disassemble
naturally evolved genomes and assemble synthetic ones from them.  The
"large population" argument doesn't get very far.  I believe the ratios are
currently of the order of 10:1 and 50:1 for the DoD:NIH and
NIH:nanotechnology R&D budgets in the U.S.  You could cut budgets
significantly and still continue progress as fast or faster if you
restructured the allocations.
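
A quick sketch of that arithmetic (taking the 10:1 and 50:1 ratios above at
face value; the ~$28B NIH baseline is my own assumption for scale, not a
figure from this thread):

    # Implied U.S. R&D budget scales from the ratios above.
    nih = 28e9       # assumed NIH budget, roughly $28B
    dod = 10 * nih   # DoD at ~10:1 over NIH      -> ~$280B
    nano = nih / 50  # nanotech at ~50:1 under NIH -> ~$0.56B
    # Reallocating even 2% of the DoD figure to nanotechnology R&D
    # would roughly 10x the nanotech budget.
    print(f"DoD ~${dod/1e9:.0f}B, NIH ~${nih/1e9:.0f}B, "
          f"nano ~${nano/1e9:.2f}B; 2% of DoD = ${0.02*dod/1e9:.1f}B")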

> 3. Future political crisis leading to large scale war with nuclear or other
> (e.g. biotech or nanotech) weapons of mass destruction results in a
> fast-forward version of 2.
>

You would have to have a human-species extinction event, with a complete
loss of the current knowledge base, for this to happen.  That is very
difficult to accomplish with nuclear war, biotech war, or nanotech war (or
grey goo).  The only thing that might do it is an Armageddon-style (the
movie) scenario.

It is damn hard to create new knowledge, but once it has been created (think
basic physics, chemistry, biology, medicine, etc.) and distributed (think
Wikipedia, Google, millions and millions of web pages, millions of books,
thousands of libraries, etc.), it is *very* hard to wipe out.

If you want an interesting project to protect humanity from backsliding,
talk to the founders of Wikipedia (or maybe the Long Now folks) and get them
to start a project to put the "critical" human knowledge base in safe places
-- in several deep mines, in several submarine capsules sunk in the oceans,
on a rocket that lands on the moon, on the next rover that goes to Mars,
etc.  Then make a conscious effort to educate most humans on the Earth that
the human knowledge base is preserved and can be recovered should something
catastrophic happen.

One could even start it as a mini-ExI project.  Start with just the
Wikipedia text entries, then add all first-level text pages from Wikipedia
links, and distribute them to ExI members around the world (EU, NA, AU,
etc.).  Continue to expand it as things like Blu-ray discs become
available.  This is feasible *now* (the current complete Wikipedia text
should easily fit on a single DVD) [1].
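
As a back-of-the-envelope check on that claim (a minimal sketch; the
article count and per-article text size are my own rough assumptions, not
figures from the size-comparisons page):

    # Rough feasibility check: does the full Wikipedia text fit on one DVD?
    # All numbers are order-of-magnitude assumptions (circa 2006).
    articles = 1_000_000    # assumed ~1M articles
    kb_per_article = 3      # assumed ~3 KB of text per article
    dvd_capacity_gb = 4.7   # single-layer DVD capacity

    total_gb = articles * kb_per_article / 1_000_000  # KB -> GB
    print(f"~{total_gb:.1f} GB of text vs. {dvd_capacity_gb} GB per DVD")
    # Prints ~3.0 GB -- comfortably under one DVD, as claimed above.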

Robert

1. http://en.wikipedia.org/wiki/Wikipedia:Size_comparisons