[ExI] Are we too stupid? (Was: Cold fusion)

Tomaz Kristan protokol2020 at gmail.com
Mon May 27 18:16:09 UTC 2013

Here is my reply to spike as well.

I think we are closer to the techno-Singularity than almost anybody
thinks. We have this one tool: (super)computer simulation. Except for
chaotic processes, simulations are already quite good - and they get
better every day. In the not-so-distant future, Apple will be able to
afford simulating 1000 slightly changed versions of its top product,
together with the production lines, or parts of both.
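The process described above amounts to a random search over design variants: perturb a base design many times, score each variant in simulation, and keep the best. A minimal sketch in Python, where simulate() is a toy stand-in for a real (and expensive) product simulator, and all names and parameters are illustrative assumptions:

```python
import random

def simulate(design):
    # Toy stand-in for an expensive product simulation:
    # here "quality" simply rewards parameters near a sweet spot of 0.5.
    return -sum((x - 0.5) ** 2 for x in design)

def best_of_variants(base, n_variants=1000, jitter=0.05, seed=0):
    """Score n slightly perturbed copies of a base design, keep the best."""
    rng = random.Random(seed)
    best_design, best_score = base, simulate(base)
    for _ in range(n_variants):
        variant = [x + rng.uniform(-jitter, jitter) for x in base]
        score = simulate(variant)
        if score > best_score:
            best_design, best_score = variant, score
    return best_design, best_score

design, score = best_of_variants([0.4, 0.6, 0.5])
```

The loop is embarrassingly parallel, which is why falling simulation costs matter so much here: 1000 variants cost little more wall-clock time than one.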

Somehow I don't think it will be Apple who leads. It will be thousands
of firms that will have to join this high-speed rat race of computerized
re-innovation on a monthly, weekly and then daily basis. We will no longer
need to wait months for the next car model. The metamorphosis of the
artifacts we know will be dramatically accelerated. This goes for chairs,
phones, planes, 3D printers, blocks of code ...

In this way, the Singularity is quite near. Maybe not the Singularity
we've expected for so long, but something that will swiftly become "the real


(This link contains the essence. One revolution (shale oil) just surprised
us, the next is on the horizon.)

On Mon, May 27, 2013 at 6:45 PM, Anders Sandberg <anders at aleph.se> wrote:

>  On 2013-05-27 15:25, spike wrote:
> Look at us; how long have we been debating the singularity right
> here?  Twenty years now?  We aren’t any closer than we were then, other
> than computers are faster and better connected.  It is possible that humans
> are just slightly too dumb to discover how to cause a singularity.
>
> This is actually a really good question. How smart do you have to be in
> order to trigger a singularity?
>
> Toby Ord pointed out that there is a curious coincidence in much of our
> thinking. We (handwavy use of us in the community who think that
> intelligence explosions are real possibilities) tend to think there is some
> limit - dogs will not invent singularity tech no matter how much time [*]
> we give them, yet many of us think there is some takeoff point near current
> human mental and technical capacities. This limit is presumably set by the
> laws of nature (in particular, the laws of computational complexity). Yet
> our current state is totally contingent - it is happening right now, and
> was not around in the past nor will it be in the future unless we manage to
> stagnate. So why are we assuming the takeoff point is near this tiny little
> window of capacity we are having right now? One could imagine Greek
> philosophers talking about Aristotle's "talking tools" and the progress
> over in Alexandria coming up with an intelligence explosion concept, yet
> clearly being far away from any takeoff points.
>
> Some possible answers might be that (1) the takeoff point is either far
> below or above us (see footnote below); (2) the question is badly posed, or
> the concepts used are verbiage; (3) there is an anthropic factor where
> beings who talk about singularities tend to be found just before them; (4)
> there is no such thing as a takeoff point; (5) we are living in an unusual
> era.
>
> [*] Give simple animals enough time to evolve in the right environment,
> and they may of course become intelligent, develop tech, and have a
> singularity. So framing the question right turns out to be really hard: how
> do we distinguish between waiting for natural evolution plus individual
> efforts as a result of it, having some resources and intelligence and using
> those, and other methods like pure random trial and error? One possible
> answer to the question might simply be that it is wrongly posed: give
> enough hydrogen some time, and it will turn into singularity creatures. I
> suspect re-framing the question so it becomes well posed will be rather
> useful for improving our thinking about the topic.
> --
> Dr Anders Sandberg
> Future of Humanity Institute
> Oxford Martin School
> Oxford University
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat