[extropy-chat] The emergence of AI

Adrian Tymes wingcat at pacbell.net
Tue Dec 7 18:28:13 UTC 2004


--- Damien Broderick <thespike at satx.rr.com> wrote:
> At 03:15 PM 12/4/2004 -0500, Robin wrote to Eliezer:
> < Wouldn't it be worth it to take the time to convince at
> least one or two people who are recognized established
> experts in the fields in which you claim to have new
> insight, so they could integrate you into the large
> intellectual conversation? >
> 
> I agree in general with Robin's comments, but offer this
> as a possible counter-balance--a furious and frustrated
> essay by Physics Nobelist Brian Josephson, concerning the
> institutional barriers to communicating unorthodox ideas:
>
> http://www.tcm.phy.cam.ac.uk/~bdj10/archivefreedom/main.html

But consider the end of this:

> the distinction between 'nutty' ideas (which either
> have no scientific meaning or contain serious
> errors), which should be barred from the archive,
> and unusual ideas which may or may not be right, and
> also may turn out to be important, which should be
> allowed on the archive.

Inventing new terms instead of adapting existing terms robs
a thought of scientific meaning - that is, meaning to other
scientists, who would be less able to comprehend it.  And
there's also the bit about the unspecified general
improvements that no one else has seen, which many would
suspect of being an error.

If many people are rejecting an idea, then no matter how
fiercely one believes in that idea, it's usually worthwhile
to see exactly what logic (and there's always some logic)
is driving the rejection.  At worst, the exercise merely
illustrates differences in basic perceived facts.  (E.g., a
rabid anti-choice protestor disagrees strongly with most
people on the relative value of the potential life of an
embryo vs. the actual life of the human woman who happens
to carry it, to the point that the moment of birth becomes
almost the sole marker of value: anyone who dies after
being born has had a nonzero amount of "life", which is all
that matters to the protestor.)  But most of the time, one
is likely to find at least slight flaws in one's own ideas;
once found, those flaws can be refined away, leaving
something more closely in sync with reality.

This effect is so strong that the odds of success for
someone who utterly rejects this approach (say, by refusing
outright to listen to any critics) are nearly zero in
almost any endeavor, as history has demonstrated
repeatedly.  The same seems likely to hold true of efforts
to develop Singularity-related technologies, even given
those technologies' unique qualities.


