[ExI] Yes, the Singularity is the greatest threat to humanity

Keith Henson hkeithhenson at gmail.com
Mon Jan 17 16:39:55 UTC 2011


On Mon, Jan 17, 2011 at 5:00 AM, Anders Sandberg <anders at aleph.se> wrote:

snip

> This is why I think upload-triggered singularities (the minds will be
> based on human motivational templates at least) or any singularity with
> a relatively slow acceleration (allowing many different smart systems to
> co-exist and start to form self-regulating systems AKA societies) are
> vastly more preferable than hard takeoffs.

I agree.  However, we need to *deeply* understand evolved human
motivational templates and either modify them or keep the entities
with them out of certain phase spaces.

As I have often discussed here, there are psychological mechanisms
that switch humans into an irrational "war mode" when environmental
conditions are such that war is a better path for genes than the
alternative.

Further, I think we are strongly biased not to understand our own
motives; in fact, we will actively fight such understanding.

> If we have reasons to think
> hard takeoffs are even somewhat likely, then we need to take
> friendliness very seriously, try to avoid singularities altogether or
> move towards the softer kinds. Whether we can affect things enough to
> influence their probabilities is a good question.

Indeed.

> Even worse, we still have no good theory to tell us the likelihood of
> hard takeoffs compared to soft (and compared to no singularity at all).
> Hopefully we can build a few tomorrow...

It's the Fermi problem again.

Keith
