[ExI] Definitions of the singularity (was: re: MAX MORE in Second Life yesterday)
Bryan Bishop
kanzure at gmail.com
Fri Jun 13 02:08:50 UTC 2008
On Thursday 12 June 2008, Michael Anissimov wrote:
> Also, some people seem to be under the mistaken impression that the
> Singularity is necessarily a worldwide event that touches everyone.
> As initially defined, it meant smarter-than-human intelligence. So
> you could have a smarter-than-human intelligence in Antarctica that
> just sits around and has no impact on the world whatsoever. Under
> the initial, Vingean, most useful definition of the Singularity, that
> would constitute one, but under the new, messy, overbroad,
> Kurzweilian definition, it wouldn't.
Hrm. I'm pretty sure that you're doubly hijacking the singularity
definition. It's either/or:
(1) Recursively self-improving intelligence, a.k.a. superintelligence,
or (2) Exponential growth.
From #1 comes #2 (theoretically ;-), and from #2 comes #1.
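The claimed link from #1 to #2 can be sketched with a toy model (purely illustrative, not from the original post): if an agent's self-improvement per step is proportional to its current capability, capability grows geometrically, i.e. exponentially in the number of steps. The function name and parameters here are hypothetical.

```python
def simulate(level=1.0, rate=0.1, steps=50):
    """Toy model: each step, the agent improves itself in
    proportion to its current capability level."""
    history = [level]
    for _ in range(steps):
        level += rate * level  # improvement proportional to current level
        history.append(level)
    return history

levels = simulate()
# Growth is geometric: each step multiplies the level by (1 + rate),
# so after n steps the level is (1 + rate)**n times the start.
ratios = [b / a for a, b in zip(levels, levels[1:])]
```

This only shows that a constant proportional self-improvement rate yields exponential growth; it says nothing about whether real systems would sustain such a rate, which is the actual point of contention.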
- Bryan, who is just nit-picking at the moment.
________________________________________
http://heybryan.org/