[extropy-chat] "The Singularity Myth"

Russell Wallace russell.wallace at gmail.com
Sun Mar 19 17:37:43 UTC 2006


On 3/19/06, Robert Bradbury <robert.bradbury at gmail.com> wrote:

> Samantha I believe is saying 50 years *max* (that is 2056) which is even
> beyond Robin's median date.  I think both are at the high end of the range.
>

If I understand Samantha correctly, she's not saying we will certainly have
the Singularity within 50 years, but that we have 50 years of runway left -
i.e. that we'd _better_ get there within that time, or we may not get there
at all. (My own view is similar, except I'd double the best guess both for
how long we have and for how long it'll likely take.)

> Why?  Most singularity assumptions posit that robust nanotechnology is
> required.  I would point out that we *have* robust nanotechnology but call
> it biotechnology.  We have *today* single desk-sized machines that can take
> apart a bacterial genome in an afternoon (to add to the 300+ already in
> databases).  That is plenty of nanoparts to play with.  We have *today* two
> well-funded companies working on synthetic genome assembly.  What most
> people lack (excepting some like Rafal and myself, who work or have worked
> in these areas) is a relatively good understanding of how much of the
> "matter as software" gold ring bionanotechnology can provide without
> requiring dry diamondoid/sapphire-based nanotechnology (DDSN).
>

In principle a lot, but wet nanotech is hard to model (and full biological
systems are harder still), so progress has to be made by experiment, which
is slow. Still, progress is being made.
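
To put a very rough number on "hard to model", here's a back-of-envelope
sketch; the atom count and timestep are my own order-of-magnitude guesses,
not figures from Robert's post or anywhere else:

# Brute-force molecular dynamics on a whole bacterium, roughly.
# Assumed orders of magnitude - my guesses, not figures from the post.
atoms = 1e10          # atoms in one bacterial cell, mostly water
timestep_s = 1e-15    # ~1 femtosecond per MD step
sim_time_s = 1e-3     # one millisecond of cell behaviour

steps = sim_time_s / timestep_s      # ~1e12 steps
atom_updates = steps * atoms         # ~1e22 single-atom updates
print("%.0e atom updates" % atom_updates)

Ten to the twenty-two updates for one millisecond of one cell is far beyond
any computer we have, which is why wet systems get explored in the lab
rather than in silico.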


> > 1. De facto world government forms, with the result that progress goes the
> > way of the Qeng Ho fleets. (The European Union is a disturbingly large step
> > on this route.)
> >
>
> This option doesn't work unless the "world government" actually imprisons
> everyone, esp. the few thousand wealthiest individuals on Earth.  By
> 2020-2030 people like Gates, Jobs, Ellison, and the Google founders should
> be able to exit stage left if governments (or conservative Luddites) carry
> self-preservation too far.
>

You're an optimist, I can tell ^.^

That would require nanotech (wet or dry) good enough for permanent survival
in space. I don't think we'll have that as early as 2030... but prove me
wrong!

> > 2. Continuing population crash renders progress unsustainable. (Continued
> > progress from a technology base as complex as today's requires very large
> > populations to be economically feasible.)
> >
>
> Requires that we completely lose the knowledge of basic biology scattered
> all around the world and the means to disassemble naturally evolved genomes
> and assemble synthetic genomes from them.  The "large population" argument
> doesn't get very far.  I believe the ratios are of the order of 10:1 and
> 50:1 for the DoD:NIH and NIH:Nanotechnology R&D budgets in the U.S.
> currently.  You could cut budgets significantly and continue progress as
> fast or faster if you restructured budget allocations.
>

There's truth in that, but you and I don't have the power to reallocate
budgets that way (if I had, I'd be doing it already). Perhaps things will
improve if more people start taking seriously the prospects for things like
healthy life extension. *Pauses for a tip of the hat, not only to the people
who are actually working on these things, but also to the people who are
working on presenting them in a positive way to the general public.*
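
For what it's worth, taking Robert's rough ratios at face value, the
arithmetic behind "restructure the allocations" goes something like this
(the 10:1 and 50:1 ratios are his order-of-magnitude figures; the rest is
just multiplication):

# Robert's stated order-of-magnitude ratios (U.S. budgets, per his post):
dod_to_nih = 10     # DoD budget ~10x the NIH budget
nih_to_nano = 50    # NIH budget ~50x nanotechnology R&D

dod_to_nano = dod_to_nih * nih_to_nano   # DoD ~500x nanotech R&D
# Diverting just 1% of the DoD budget would add roughly five times the
# entire current nanotech R&D budget on top of what's already there.
boost = 0.01 * dod_to_nano
print("1%% of DoD = %.0fx current nanotech R&D" % boost)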

> > 3. Future political crisis leading to large-scale war with nuclear or other
> > (e.g. biotech or nanotech) weapons of mass destruction results in a
> > fast-forward version of 2.
> >
>
> You have to have a human species extinction event with a complete loss of
> the current knowledge base for this to happen.  It is very difficult to
> accomplish this with nuclear war, biotech war, or nanotech war (or grey
> goo).  The only thing that might do it is an Armageddon-like scenario (as
> in the movie).
>
> It is damn hard to create new knowledge, but once it has been created
> (think basic physics, chemistry, biology, medicine, etc.) and distributed
> (think Wikipedia, Google, millions and millions of web pages, millions of
> books, thousands of libraries, etc.) it is *very* hard to wipe it out.
>

*nods* I'm not so concerned with the physical survival of data - since the
invention of the printing press, that hasn't been a big problem. The
social/political conditions that allow free thought and inquiry, though, are
an aberration on the broad scale of history, an intermittent flicker in one
corner of the world. There isn't any reason to suppose they'll persist
indefinitely, and the current trend towards polarization of political
thought between nihilistic strains of atheism and fundamentalist strains of
religion has me concerned that "not indefinitely" may end up being "not very
long".

Still, the nice thing about being a pessimist is that your surprises tend to
be pleasant ones ^.^