[extropy-chat] Still confused:)

Robert Bradbury robert.bradbury at gmail.com
Wed Aug 30 17:57:28 UTC 2006


On 8/30/06, Michael Anissimov <michaelanissimov at gmail.com> wrote:

> It has little to do with
> technological progress, but rather smarter-than-human intelligence.
> Even if technological progress were slowing down, we could still have
> a Singularity, because it's fundamentally a *cognitive* advance, not
> related to technology-in-general except insofar as it would be a
> specific technological advance with cognitive results.


Actually Michael, I disagree strongly with the statement that the
Singularity is a result of, or must be based upon, "cognitive advances" [1].
It is related primarily to population, knowledge and wealth, which in turn
depend upon the technology at one's disposal.  The progress does not involve
any significant changes to human cognition, which has remained essentially
unchanged for the last several thousand years. [2]

The increase in population gives us greater aggregate computational power.
An increase in knowledge and accumulated wealth allows us to live longer,
healthier lives and to devote an increased fraction of that knowledge and
wealth towards increasing knowledge and accumulating more wealth.  That
knowledge reached a tipping point sometime in the last 40 years -- whether
you locate it in the invention of the transistor, the IC, the microprocessor,
the Web, or Google (or some combination of these) [3].  At that point the
growth of knowledge shifted significantly above the population growth rate,
accelerating us beyond the "natural" rate of the march to the Singularity.
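The compounding argument can be sketched numerically.  This is only an
illustrative toy model (the growth rates below are assumptions chosen to
show the shape of the argument, not figures from this post): once knowledge
compounds faster than population, knowledge-per-person grows without bound.

```python
def compound(initial, rate, years):
    """Value of `initial` growing at `rate` per year, compounded annually."""
    return initial * (1 + rate) ** years

# Hypothetical rates for illustration only:
population_growth = 0.01   # ~1%/yr, roughly modern world population growth
knowledge_growth = 0.05    # assumed faster compounding after the transistor

pop = compound(1.0, population_growth, 40)    # population multiplier over 40 yr
know = compound(1.0, knowledge_growth, 40)    # knowledge multiplier over 40 yr

print(f"over 40 years: population x{pop:.2f}, knowledge x{know:.2f}")
# Knowledge per capita (know/pop) keeps rising whenever
# knowledge_growth > population_growth -- the acceleration described above.
```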

The *only* part of the Singularity that requires cognitive enhancement is
the part which requires progress or insight exceeding the capacity of a
single human mind or a group of well-educated minds.  Almost everything in
the "transhumanist" vision -- lifespan extension, posthuman bodies, robust
molecular nanotechnology (the Feynman/Drexler vision), living for "free"
(Sapphire Mansions), relatively unlimited energy, solar system development
(Matrioshka Brains), galactic colonization (Far Side Parties), mind
uploading, cryonic reanimation, etc. -- can be done *without* advanced AI.
The only thing that advanced AI *might* (and I stress might) do is
accelerate the rate of progress toward the Singularity.

Robert

1. Unless your emphasis on "cognition" is weighted heavily towards the
"knowledge" (information) aspect.  Entire fields, from archeology to
paleontology, are based on our ability to develop ways to think about raw
information.  It is the information, and our innate ability to think about
it, which are important.
2. One could view the "tipping point" for the start of the Singularity as
the mutations in a few human genes which promoted human migration,
exploration and perhaps aggression.  They tipped the scales, allowing
humans to populate the world and avoid single-region extinctions due to the
natural hazard function.
3. One could extend this backwards to things like the Library of Alexandria
or even clay tablets used in Babylon (which enabled the replication of
knowledge allowing it to survive the death of single humans or tribes of
humans).

