[ExI] Are we too stupid? (Was: Cold fusion)
Anders Sandberg
anders at aleph.se
Sun Jun 2 22:06:40 UTC 2013
On 2013-06-02 15:20, Stefano Vaj wrote:
> On 27 May 2013 18:45, Anders Sandberg <anders at aleph.se> wrote:
>
> This is actually a really good question. How smart do you have to be
> in order to trigger a singularity?
>
>
> Let us give a more Nietzschean turn: how much WILL do you have to summon
> up to trigger a singularity?
Well, while I think you have an interesting question, it is also
somewhat limited. Without the right preconditions you cannot do it, and
likely cannot even create the right preconditions, no matter how strong
your will is. After all, a dog that truly Wants a singularity will likely
(1) have an erroneous concept of what it is, and (2) be unable even to
conceptualize what it should do to get it, let alone do it. We might
argue that humans are smart enough to will something that could
eventually become real, but that is basically my original question.
> OTOH, a singularity, a Zeit-Umbruch (an epochal break), can well be
> identified in catastrophic changes of status, in paradigm shifts, in
> radical breakthroughs changing the very nature and sense of our "being
> in the world".
In my sketch for a paper on the thread subject, I was looking at a model
of intelligence enhancement. The vanilla scenario, where something or
someone able to improve its own intelligence (and hence its ability to
improve further) undergoes runaway growth, can be expressed as iterating
a function: f(x) denotes how smart you can make yourself if you have
intelligence x, so x_{n+1}=f(x_n) describes a simplified dynamics across
time. This is a well understood area of math, and the behavior when f(x)
is monotonically increasing is pretty simple: either you cannot improve
(f(x)<=x) or you can (f(x)>x), in which case the iterates climb until
they reach the next fixed point above, where f(x)=x (or diverge if there
is none). So from a large perspective it looks like intelligence
increases slowly (due to evolution, random drift, waiting for someone
smart enough to come up with the trick) until it reaches a critical
threshold, where it makes a jump to a higher level. Very much a paradigm
shift, although in this case inside a simplistic model rather than with
the full complexity of how our being in the world changes.
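
To make the toy model concrete, here is a minimal Python sketch of the
iteration. The particular f below is my own illustrative choice (a
sigmoid that crosses the diagonal three times), not anything implied by
the argument above; it is just one monotonically increasing map that
shows the stall-below-threshold, jump-above-threshold behaviour.

import math

def f(x):
    # Illustrative improvement map: how smart you can make yourself at
    # intelligence x. For this f the fixed points f(x)=x sit near
    # x ~ 0.21 (stable), x = 1.5 (unstable threshold) and x ~ 2.79 (stable).
    return 3.0 / (1.0 + math.exp(-2.0 * (x - 1.5)))

def iterate(x0, steps=30):
    # Run the self-improvement dynamics x_{n+1} = f(x_n) from level x0.
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = f(x)
        trajectory.append(x)
    return trajectory

if __name__ == "__main__":
    print(iterate(1.4)[-1])  # starts below the threshold: stalls near 0.21
    print(iterate(1.6)[-1])  # starts just above it: climbs to about 2.79

Running it, a start at 1.4 settles at the lower fixed point while a start
at 1.6 runs away up to the higher one: the long plateau followed by a
sudden jump described above.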
> Now, I have been persuaded since the eighties that we do stand
> on the verge of one such metamorphosis.
What persuaded you? This is what I think is the interesting question:
what would provide plausible evidence for a singularity? Or even for an
imminent conceptual breakthrough?
Consider making a guess at when the next Michael Bay blockbuster action
movie will come out versus when somebody will prove or disprove the
Riemann hypothesis. One seems to be far, far more predictable than the other.
Of course, one can "cheat" by simply making self-overcoming popular,
which I guess is the proper Nietzschean approach. But I find the
question about what deduction/induction can tell us about cultural
changes interesting too.
> Transhumanism for me is basically the ideology of those embracing change
> and self-overcoming as a fundamental ethical option, and offering that as
> an alternative to the humanist bias.
Amen to that.
--
Anders Sandberg
Future of Humanity Institute
Oxford University