[ExI] MAX MORE in Second Life yesterday
kanzure at gmail.com
Fri Jun 13 22:38:53 UTC 2008
On Friday 13 June 2008, Michael Anissimov wrote:
> On Fri, Jun 13, 2008 at 3:17 PM, Bryan Bishop wrote:
> > <http://en.wikipedia.org/wiki/Technological_singularity>
> > > The technological singularity is a hypothesised point in the
> > > future variously characterized by the technological creation of
> > > self-improving intelligence, unprecedentedly rapid technological
> > > progress, or some combination of the two.
> Yes, this is a muddying of the waters of the original definition,
> caused when Kurzweil took up the term "Singularity" and started
> defining it in all sorts of new ways that have nothing to do with the
> original definition. I guess I should just give up and start calling
> the original definition "Vinge's Event Horizon".
I would agree that 'rapid technological progress' is muddy and doesn't
preserve as much, but when it is replaced with the idea of exponential
or accelerating growth, you get the same thing as with seed AI
scenarios, since self-replication leads to recursively self-improving
intelligence.
> > You still haven't proven to me, ever since our chat a few months
> > ago, how superintelligence alone could brute-force itself out of a
> > machine chassis without the interfaces and grounding that would
> > frankly have to grow exponentially in order to keep up with those
> > numbers.
> No, clearly it would need to acquire interfaces to the external
> world. These arguments can't be "proven" one way or the other -- I
> can only present the arguments and wait for your response.
Yes, but it's a bigger problem than just that. It's not only that it
needs interfaces to the external world; it needs to be an embodied,
recursively self-improving process itself. Otherwise it's just an
embodiment of the same constraints and limitations that we find
ourselves in, or of the constraints and limitations of current
technology. Sure, you could simulate a brain with a big enough
supercomputer using modern techniques, but that's only linearly
recursive, and it will hit the fabricational barriers, just as we are
hitting them now (i.e., all of that information is locked up in
proprietary databases (or, probably, people)), etc.
> I think it would benefit H+ as a whole if more transhumanists were
> aware of Vinge's Event Horizon, and how it differs fundamentally from
> Kurzweil's Singularity.
I don't know if it matters that much. We all share the same fundamental
values anyway; it just seems like we're not synchronized on the same
levels of thought here.