[ExI] singularity utopia's farewell

Adrian Tymes atymes at gmail.com
Thu Sep 16 17:26:31 UTC 2010


2010/9/16 Gregory Jones <spike66 at att.net>

>  We have been perplexed by the puzzling commentary by Singularity Utopia
>

I wasn't.  I've seen this category of fail far too often.  Here's roughly what happened:

* SU saw something (the Singularity) that promised to make a lot of problems go away.  (The particular something is irrelevant; people have made this mistake about a number of other things, with different justifications.)

* SU confused "a lot" with "all".  (Failure #1.  In short, There Is No God Solution.  Some things can be miraculous, or at least very good, but if you think something solves all problems forever without costing anyone anything - not merely compared to current solutions, or for some current set of problems, for a limited time, or at a limited and acceptable cost (including a cost acceptable only because it is borne by others), but literally everything forever - you can safely assume you've overlooked or misunderstood something.)

* Based on that incorrect data, SU logically decided that promoting it would be the best course of action.  (If there were some god-action that could fix all problems with practically zero effort, then yes, getting everyone to do it would be best.  Again: we know the Singularity is not that, but SU believed it was.)

* SU went to an area perceived friendly to that something (this list).

* SU was informed of the realities of the thing, and how they were far less than initially perceived.  In particular, SU was informed that the thing would require a lot of careful, skilled work, not merely boundless enthusiasm.

* SU experienced dissonance between SU's perception of the thing and what SU was being told.  In particular, SU perceived (likely not fully consciously) that SU would have to actually do some less-than-enjoyable work to achieve this end, which made accepting the new truth less palatable.  (Letting such personal concerns affect one's judgment of what is real was failure #2.  Reality doesn't care if you suffer.)

* Knowing that many people are generally resistant to change, SU resolved the dissonance by believing that even these supposed adherents must be more of the same, and therefore that anything they said could be dismissed without rational consideration.  (Failure #3: few people resort to personal attacks when they see a way to instead demonstrate the "obvious" benefits of and reasons for their position.  Most such disagreements are not about logic, but about the axioms and data from which the logic proceeds.)

* Having committed to that, SU then rationalized why we were "attacking" that vision, and ignored all further evidence to the contrary.  (Extension of failure #3.)

There is a sad irony in this case, because the principles of Extropy, as defined by Mr. More, include memes that defend against this category of error.  This particular collection of mental missteps is more common in, for example, politics.  (To be fair, there are more people in politics who will personally attack to back up ulterior motives, but politics - especially on large scales - often deals with situations so vast that most stakeholders are honestly starting from very limited sets of data, each perceiving parts of the whole that other stakeholders do not, and vice versa.)